Airflow 2.3 scheduler error: 'V1Container' object has no attribute '_startup_probe'

Apache Airflow version

2.3.0 (latest released)

What happened

After migrating from Airflow 2.2.4 to 2.3.0, the scheduler fell into a crash loop, throwing:

--- Logging error ---
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/jobs/scheduler_job.py", line 736, in _execute
    self._run_scheduler_loop()
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/jobs/scheduler_job.py", line 826, in _run_scheduler_loop
    self.executor.heartbeat()
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/executors/base_executor.py", line 171, in heartbeat
    self.sync()
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/executors/kubernetes_executor.py", line 613, in sync
    self.kube_scheduler.run_next(task)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/executors/kubernetes_executor.py", line 300, in run_next
    self.log.info('Kubernetes job is %s', str(next_job).replace("\n", " "))
  File "/home/airflow/.local/lib/python3.9/site-packages/kubernetes/client/models/v1_pod.py", line 214, in __repr__
    return self.to_str()
  File "/home/airflow/.local/lib/python3.9/site-packages/kubernetes/client/models/v1_pod.py", line 210, in to_str
    return pprint.pformat(self.to_dict())
  File "/home/airflow/.local/lib/python3.9/site-packages/kubernetes/client/models/v1_pod.py", line 196, in to_dict
    result[attr] = value.to_dict()
  File "/home/airflow/.local/lib/python3.9/site-packages/kubernetes/client/models/v1_pod_spec.py", line 1070, in to_dict
    result[attr] = list(map(
  File "/home/airflow/.local/lib/python3.9/site-packages/kubernetes/client/models/v1_pod_spec.py", line 1071, in <lambda>
    lambda x: x.to_dict() if hasattr(x, "to_dict") else x,
  File "/home/airflow/.local/lib/python3.9/site-packages/kubernetes/client/models/v1_container.py", line 672, in to_dict
    value = getattr(self, attr)
  File "/home/airflow/.local/lib/python3.9/site-packages/kubernetes/client/models/v1_container.py", line 464, in startup_probe
    return self._startup_probe
AttributeError: 'V1Container' object has no attribute '_startup_probe'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.9/logging/__init__.py", line 1083, in emit
    msg = self.format(record)
  File "/usr/local/lib/python3.9/logging/__init__.py", line 927, in format
    return fmt.format(record)
  File "/usr/local/lib/python3.9/logging/__init__.py", line 663, in format
    record.message = record.getMessage()
  File "/usr/local/lib/python3.9/logging/__init__.py", line 367, in getMessage
    msg = msg % self.args
  File "/home/airflow/.local/lib/python3.9/site-packages/kubernetes/client/models/v1_pod.py", line 214, in __repr__
    return self.to_str()
  File "/home/airflow/.local/lib/python3.9/site-packages/kubernetes/client/models/v1_pod.py", line 210, in to_str
    return pprint.pformat(self.to_dict())
  File "/home/airflow/.local/lib/python3.9/site-packages/kubernetes/client/models/v1_pod.py", line 196, in to_dict
    result[attr] = value.to_dict()
  File "/home/airflow/.local/lib/python3.9/site-packages/kubernetes/client/models/v1_pod_spec.py", line 1070, in to_dict
    result[attr] = list(map(
  File "/home/airflow/.local/lib/python3.9/site-packages/kubernetes/client/models/v1_pod_spec.py", line 1071, in <lambda>
    lambda x: x.to_dict() if hasattr(x, "to_dict") else x,
  File "/home/airflow/.local/lib/python3.9/site-packages/kubernetes/client/models/v1_container.py", line 672, in to_dict
    value = getattr(self, attr)
  File "/home/airflow/.local/lib/python3.9/site-packages/kubernetes/client/models/v1_container.py", line 464, in startup_probe
    return self._startup_probe
AttributeError: 'V1Container' object has no attribute '_startup_probe'
Call stack:
  File "/home/airflow/.local/bin/airflow", line 8, in <module>
    sys.exit(main())
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/__main__.py", line 38, in main
    args.func(args)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/cli/cli_parser.py", line 51, in command
    return func(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/utils/cli.py", line 99, in wrapper
    return f(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/cli/commands/scheduler_command.py", line 75, in scheduler
    _run_scheduler_job(args=args)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/cli/commands/scheduler_command.py", line 46, in _run_scheduler_job
    job.run()
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/jobs/base_job.py", line 244, in run
    self._execute()
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/jobs/scheduler_job.py", line 757, in _execute
    self.executor.end()
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/executors/kubernetes_executor.py", line 809, in end
    self._flush_task_queue()
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/executors/kubernetes_executor.py", line 767, in _flush_task_queue
    self.log.warning('Executor shutting down, will NOT run task=%s', task)
Unable to print the message and arguments - possible formatting error.
Use the traceback above to help find the error.

The kubernetes Python library version was exactly as specified in the constraints file: https://raw.githubusercontent.com/apache/airflow/constraints-2.3.0/constraints-3.9.txt
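
The error pattern points at objects whose stored state predates the currently installed kubernetes client: pickle restores an object's __dict__ without calling __init__, so a pod object pickled by an older client never gains the private fields that newer properties such as startup_probe read from. The following minimal sketch uses local stand-in classes, not the real kubernetes models, and assumes the stored pods were pickled by the pre-upgrade client:

import pickle

class V1Container:                    # stand-in for the "old" client model: no startup_probe yet
    def __init__(self, name):
        self._name = name

blob = pickle.dumps(V1Container("base"))   # what an older installation would have stored

class V1Container:                    # stand-in for the "new" client model: startup_probe added
    def __init__(self, name, startup_probe=None):
        self._name = name
        self._startup_probe = startup_probe

    @property
    def startup_probe(self):
        return self._startup_probe

restored = pickle.loads(blob)         # unpickling restores __dict__ without calling __init__
print(restored.startup_probe)         # AttributeError: 'V1Container' object has no attribute '_startup_probe'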

What you think should happen instead

The scheduler should keep running after the upgrade instead of crash-looping.

How to reproduce

Not 100% sure, but:

  1. Run Airflow 2.2.4 using the official Helm chart
  2. Run some DAGs to have some records in the DB (a hypothetical example of such a DAG is sketched after this list)
  3. Migrate to 2.3.0 (replace the 2.2.4 image with the 2.3.0 one)
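
One plausible source of stored pod objects (an assumption; the report does not say which tasks were involved) is tasks that override their pod via executor_config, since those V1Pod objects are pickled into the metadata DB with the task instance. A minimal hypothetical example of such a task, using the standard pod_override key:

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from kubernetes.client import models as k8s

# Hypothetical DAG: the V1Pod below is pickled into the metadata DB as part of
# the task's executor_config.
with DAG("executor_config_example", start_date=datetime(2022, 1, 1), schedule_interval=None) as dag:
    PythonOperator(
        task_id="needs_more_memory",
        python_callable=lambda: print("hello"),
        executor_config={
            "pod_override": k8s.V1Pod(
                spec=k8s.V1PodSpec(
                    containers=[
                        k8s.V1Container(
                            name="base",
                            resources=k8s.V1ResourceRequirements(requests={"memory": "1Gi"}),
                        )
                    ]
                )
            )
        },
    )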

Operating System

Debian GNU/Linux 11 (bullseye)

Versions of Apache Airflow Providers

irrelevant

Deployment

Official Apache Airflow Helm Chart

Deployment details

  • KubernetesExecutor
  • PostgreSQL (RDS) as the Airflow DB
  • Python 3.9
  • Docker images built from apache/airflow:2.3.0-python3.9 (some additional libraries installed)

Anything else

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

Issue Analytics

  • State: closed
  • Created: a year ago
  • Comments: 41 (28 by maintainers)

Top GitHub Comments

4 reactions
hterik commented, Jun 14, 2022

After upgrading from 2.2.4 to 2.3.2, I get the same error in the webserver when trying to view any task result for runs that were produced before the upgrade. Runs that happen after the upgrade can still be viewed.

For example:

Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.9/site-packages/flask/app.py", line 2447, in wsgi_app
    response = self.full_dispatch_request()
  File "/home/airflow/.local/lib/python3.9/site-packages/flask/app.py", line 1952, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/home/airflow/.local/lib/python3.9/site-packages/flask/app.py", line 1821, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/home/airflow/.local/lib/python3.9/site-packages/flask/_compat.py", line 39, in reraise
    raise value
  File "/home/airflow/.local/lib/python3.9/site-packages/flask/app.py", line 1950, in full_dispatch_request
    rv = self.dispatch_request()
  File "/home/airflow/.local/lib/python3.9/site-packages/flask/app.py", line 1936, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/www/auth.py", line 43, in decorated
    return func(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/www/decorators.py", line 117, in view_func
    return f(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/www/decorators.py", line 80, in wrapper
    return f(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/utils/session.py", line 71, in wrapper
    return func(*args, session=session, **kwargs)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/www/views.py", line 2882, in graph
    return self.render_template(
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/www/views.py", line 608, in render_template
    return super().render_template(
  File "/home/airflow/.local/lib/python3.9/site-packages/flask_appbuilder/baseviews.py", line 287, in render_template
    return render_template(
  File "/home/airflow/.local/lib/python3.9/site-packages/flask/templating.py", line 137, in render_template
    return _render(
  File "/home/airflow/.local/lib/python3.9/site-packages/flask/templating.py", line 120, in _render
    rv = template.render(context)
  File "/home/airflow/.local/lib/python3.9/site-packages/jinja2/environment.py", line 1291, in render
    self.environment.handle_exception()
  File "/home/airflow/.local/lib/python3.9/site-packages/jinja2/environment.py", line 925, in handle_exception
    raise rewrite_traceback_stack(source=source)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/www/templates/airflow/graph.html", line 21, in top-level template code
    {% from 'appbuilder/loading_dots.html' import loading_dots %}
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/www/templates/airflow/dag.html", line 36, in top-level template code
    {% set execution_date_arg = request.args.get('execution_date') %}
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/www/templates/airflow/main.html", line 21, in top-level template code
    {% from 'airflow/_messages.html' import show_message %}
  File "/home/airflow/.local/lib/python3.9/site-packages/flask_appbuilder/templates/appbuilder/baselayout.html", line 2, in top-level template code
    {% import 'appbuilder/baselib.html' as baselib %}
  File "/home/airflow/.local/lib/python3.9/site-packages/flask_appbuilder/templates/appbuilder/init.html", line 50, in top-level template code
    {% block tail %}
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/www/templates/airflow/graph.html", line 137, in block 'tail'
    let taskInstances = {{ task_instances|tojson }};
  File "/home/airflow/.local/lib/python3.9/site-packages/flask/json/__init__.py", line 376, in tojson_filter
    return Markup(htmlsafe_dumps(obj, **kwargs))
  File "/home/airflow/.local/lib/python3.9/site-packages/flask/json/__init__.py", line 290, in htmlsafe_dumps
    dumps(obj, **kwargs)
  File "/home/airflow/.local/lib/python3.9/site-packages/flask/json/__init__.py", line 211, in dumps
    rv = _json.dumps(obj, **kwargs)
  File "/usr/local/lib/python3.9/json/__init__.py", line 234, in dumps
    return cls(
  File "/usr/local/lib/python3.9/json/encoder.py", line 199, in encode
    chunks = self.iterencode(o, _one_shot=True)
  File "/usr/local/lib/python3.9/json/encoder.py", line 257, in iterencode
    return _iterencode(o, 0)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/utils/json.py", line 84, in _default
    return PodGenerator.serialize_pod(obj)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/kubernetes/pod_generator.py", line 404, in serialize_pod
    return api_client.sanitize_for_serialization(pod)
  File "/home/airflow/.local/lib/python3.9/site-packages/kubernetes/client/api_client.py", line 241, in sanitize_for_serialization
    return {key: self.sanitize_for_serialization(val)
  File "/home/airflow/.local/lib/python3.9/site-packages/kubernetes/client/api_client.py", line 241, in <dictcomp>
    return {key: self.sanitize_for_serialization(val)
  File "/home/airflow/.local/lib/python3.9/site-packages/kubernetes/client/api_client.py", line 237, in sanitize_for_serialization
    obj_dict = {obj.attribute_map[attr]: getattr(obj, attr)
  File "/home/airflow/.local/lib/python3.9/site-packages/kubernetes/client/api_client.py", line 239, in <dictcomp>
    if getattr(obj, attr) is not None}
  File "/home/airflow/.local/lib/python3.9/site-packages/kubernetes/client/models/v1_pod_spec.py", line 397, in ephemeral_containers
    return self._ephemeral_containers
AttributeError: 'V1PodSpec' object has no attribute '_ephemeral_containers'

Using the same kubernetes Python client as listed in the official constraints file (kubernetes==23.6.0). The cluster is a managed AKS, version 1.23.5.

airflow dags reserialize did not help.
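
If you need to find which rows are affected, one hypothetical diagnostic (the names and layout are assumptions: it presumes the pods sit under a pod_override key in TaskInstance.executor_config, and it should only be run against a copy of the metadata DB) is to re-serialize each stored pod with the current client via PodGenerator.serialize_pod, the same call that fails in the traceback above:

from airflow.kubernetes.pod_generator import PodGenerator
from airflow.models import TaskInstance
from airflow.utils.session import create_session

# Hypothetical sketch: flag task instances whose stored executor_config pod no
# longer serializes under the currently installed kubernetes client.
with create_session() as session:
    for ti in session.query(TaskInstance).filter(TaskInstance.executor_config.isnot(None)):
        config = ti.executor_config or {}
        pod = config.get("pod_override") if isinstance(config, dict) else None
        if pod is None:
            continue
        try:
            PodGenerator.serialize_pod(pod)
        except AttributeError as err:
            print(f"{ti.dag_id}.{ti.task_id} run_id={ti.run_id}: {err}")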

3 reactions
dstandish commented, Jun 14, 2022
Read more comments on GitHub >
