Could not get scheduler_job_id
Apache Airflow version: 2.0.0
Kubernetes version (if you are using kubernetes) (use `kubectl version`): 1.18.3
Environment:
Cloud provider or hardware configuration: AWS
What happened:
When a DAG is triggered, it gets scheduled, but its tasks are never run. When attempting to run a task manually, the web UI shows an error:
Something bad has happened.
Please consider letting us know by creating a bug report using GitHub.
Python version: 3.8.7
Airflow version: 2.0.0
Node: airflow-web-ffdd89d6-h98vj
-------------------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 2447, in wsgi_app
    response = self.full_dispatch_request()
  File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 1952, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 1821, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/usr/local/lib/python3.8/site-packages/flask/_compat.py", line 39, in reraise
    raise value
  File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 1950, in full_dispatch_request
    rv = self.dispatch_request()
  File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 1936, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/usr/local/lib/python3.8/site-packages/airflow/www/auth.py", line 34, in decorated
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/airflow/www/decorators.py", line 60, in wrapper
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/airflow/www/views.py", line 1366, in run
    executor.start()
  File "/usr/local/lib/python3.8/site-packages/airflow/executors/kubernetes_executor.py", line 493, in start
    raise AirflowException("Could not get scheduler_job_id")
airflow.exceptions.AirflowException: Could not get scheduler_job_id
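For context, here is a minimal sketch (my reconstruction, not the actual Airflow source) of the guard in `KubernetesExecutor.start()` that produces this exception: the executor refuses to start when no scheduler job id has been assigned to it, which is what happens when `start()` is reached through a code path that never sets one.

```python
# Hypothetical stand-ins for illustration only; the real classes live in
# airflow.exceptions and airflow.executors.kubernetes_executor.
class AirflowException(Exception):
    """Stand-in for airflow.exceptions.AirflowException."""


class KubernetesExecutorSketch:
    def __init__(self, job_id=None):
        # In normal operation the scheduler assigns job_id before calling
        # start(); a code path that skips that assignment (e.g. running a
        # task from the web UI) leaves it unset.
        self.job_id = job_id

    def start(self):
        # The guard that matches the traceback above.
        if not self.job_id:
            raise AirflowException("Could not get scheduler_job_id")
        # ... normal executor startup would continue here ...


# A scheduler-started executor has a job_id and starts fine:
KubernetesExecutorSketch(job_id=42).start()

# An executor started without one reproduces the reported error:
try:
    KubernetesExecutorSketch().start()
except AirflowException as exc:
    print(exc)  # Could not get scheduler_job_id
```

This suggests the bug is less about Kubernetes itself and more about which code path constructs the executor: the web server's "Run" handler builds an executor that never received a scheduler job id.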
What you expected to happen:
The task to run successfully without errors.
How to reproduce it:
I haven't pinpointed what causes the issue, other than that it appeared after an attempted upgrade from Airflow 1.10.14 to Airflow 2.0.0.
Anything else we need to know:
This error was encountered during an upgrade from Airflow 1.10.14 to Airflow 2.0.0.
EDIT: Formatted to fit the issue template
Top GitHub Comments
I am facing the same error while trying to backfill. Logs:
I also encountered this problem, but I'm curious about something else: why is the same DAG loaded three times before the exception happens? The paths of the last two loaded DAGs also seem incorrect.