
Scheduler keeps exiting with AttributeError: 'MySQLConverter' object has no attribute '_dagruntype_to_mysql' when using docker 2.0.0b2 image

See original GitHub issue

I am trying to set up an installation of Airflow using the Docker image 2.0.0b2-python3.7, with MySQL 5.7 on GCP Cloud SQL. The scheduler keeps exiting with the error below:

____________       _____________
 ____    |__( )_________  __/__  /________      __
____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
 _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
[2020-11-17 04:09:34,142] {scheduler_job.py:1249} INFO - Starting the scheduler
[2020-11-17 04:09:34,143] {scheduler_job.py:1254} INFO - Processing each file at most -1 times
[2020-11-17 04:09:34,144] {kubernetes_executor.py:520} INFO - Start Kubernetes executor
[2020-11-17 04:09:34,169] {kubernetes_executor.py:126} INFO - Event: and now my watch begins starting at resource_version: 0
[2020-11-17 04:09:34,259] {kubernetes_executor.py:462} INFO - When executor started up, found 0 queued task instances
[2020-11-17 04:09:34,306] {dag_processing.py:250} INFO - Launched DagFileProcessorManager with pid: 39
[2020-11-17 04:09:34,308] {scheduler_job.py:1761} INFO - Resetting orphaned tasks for active dag runs
/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py:332 
DeprecationWarning: The logging_level option in  has been moved to the 
logging_level option in  - the old setting has been used, but please update your
config.
/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py:332 
DeprecationWarning: The fab_logging_level option in  has been moved to the 
fab_logging_level option in  - the old setting has been used, but please update 
your config.
/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py:332 
DeprecationWarning: The remote_logging option in  has been moved to the 
remote_logging option in  - the old setting has been used, but please update 
your config.
[2020-11-17 04:09:34,322] {settings.py:52} INFO - Configured default timezone Timezone('UTC')
/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py:332 
DeprecationWarning: The task_log_reader option in  has been moved to the 
task_log_reader option in  - the old setting has been used, but please update 
your config.
[2020-11-17 04:09:34,491] {scheduler_job.py:1301} ERROR - Exception when executing SchedulerJob._run_scheduler_loop
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.7/site-packages/mysql/connector/conversion.py", line 183, in to_mysql
    return getattr(self, "_{0}_to_mysql".format(type_name))(value)
AttributeError: 'MySQLConverter' object has no attribute '_dagruntype_to_mysql'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.7/site-packages/mysql/connector/cursor.py", line 410, in _process_params_dict
    conv = to_mysql(conv)
  File "/home/airflow/.local/lib/python3.7/site-packages/mysql/connector/conversion.py", line 186, in to_mysql
    "MySQL type".format(type_name))
TypeError: Python 'dagruntype' cannot be converted to a MySQL type
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1277, in _execute_context
    cursor, statement, parameters, context
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 593, in do_execute
    cursor.execute(statement, parameters)
  File "/home/airflow/.local/lib/python3.7/site-packages/mysql/connector/cursor.py", line 555, in execute
    stmt, self._process_params_dict(params))
  File "/home/airflow/.local/lib/python3.7/site-packages/mysql/connector/cursor.py", line 419, in _process_params_dict
    "Failed processing pyformat-parameters; %s" % err)
mysql.connector.errors.ProgrammingError: Failed processing pyformat-parameters; Python 'dagruntype' cannot be converted to a MySQL type
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py", line 1283, in _execute
    self._run_scheduler_loop()
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py", line 1357, in _run_scheduler_loop
    self.adopt_or_reset_orphaned_tasks()
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 63, in wrapper
    return func(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py", line 1797, in adopt_or_reset_orphaned_tasks
    tis_to_reset_or_adopt = with_row_locks(query, of=TI, **skip_locked(session=session)).all()
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3346, in all
    return list(self)
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3508, in __iter__
    return self._execute_and_instances(context)
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3533, in _execute_and_instances
    result = conn.execute(querycontext.statement, self._params)
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1011, in execute
    return meth(self, multiparams, params)
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/sql/elements.py", line 298, in _execute_on_connection
    return connection._execute_clauseelement(self, multiparams, params)
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1130, in _execute_clauseelement
    distilled_params,
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1317, in _execute_context
    e, statement, parameters, cursor, context
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1511, in _handle_dbapi_exception
    sqlalchemy_exception, with_traceback=exc_info[2], from_=e
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
    raise exception
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1277, in _execute_context
    cursor, statement, parameters, context
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 593, in do_execute
    cursor.execute(statement, parameters)
  File "/home/airflow/.local/lib/python3.7/site-packages/mysql/connector/cursor.py", line 555, in execute
    stmt, self._process_params_dict(params))
  File "/home/airflow/.local/lib/python3.7/site-packages/mysql/connector/cursor.py", line 419, in _process_params_dict
    "Failed processing pyformat-parameters; %s" % err)
sqlalchemy.exc.ProgrammingError: (mysql.connector.errors.ProgrammingError) Failed processing pyformat-parameters; Python 'dagruntype' cannot be converted to a MySQL type
[SQL: SELECT task_instance.task_id AS task_instance_task_id, task_instance.dag_id AS task_instance_dag_id, task_instance.execution_date AS task_instance_execution_date 
FROM task_instance LEFT OUTER JOIN job ON job.id = task_instance.queued_by_job_id INNER JOIN dag_run ON task_instance.dag_id = dag_run.dag_id AND task_instance.execution_date = dag_run.execution_date 
WHERE task_instance.state IN (%(state_1)s, %(state_2)s, %(state_3)s) AND (task_instance.queued_by_job_id IS NULL OR job.state != %(state_4)s) AND dag_run.run_type != %(run_type_1)s AND dag_run.state = %(state_5)s FOR UPDATE]
[parameters: {'state_1': 'scheduled', 'state_2': 'queued', 'state_3': 'running', 'state_4': 'running', 'run_type_1': <DagRunType.BACKFILL_JOB: 'backfill'>, 'state_5': 'running'}]
(Background on this error at: http://sqlalche.me/e/13/f405)
[2020-11-17 04:09:35,498] {process_utils.py:95} INFO - Sending Signals.SIGTERM to GPID 39
[2020-11-17 04:09:35,711] {process_utils.py:61} INFO - Process psutil.Process(pid=39, status='terminated', exitcode=0, started='04:09:34') (39) terminated with exit code 0
[2020-11-17 04:09:35,712] {scheduler_job.py:1304} INFO - Exited execute loop
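The first frame of the traceback shows the mechanism behind the failure: mysql-connector-python picks a conversion method by building the name `_<classname>_to_mysql` from the bind parameter's Python class name. Airflow 2.0 passes a `DagRunType` enum (a `str` subclass) as the `run_type_1` parameter, so the lookup for `_dagruntype_to_mysql` fails even though the underlying value is a plain string. A minimal sketch of that dispatch pattern (the `MiniConverter` class is illustrative, not the connector's real implementation):

```python
import enum


class MiniConverter:
    """Simplified version of the name-based dispatch mysql-connector uses."""

    def _int_to_mysql(self, value):
        return value

    def _str_to_mysql(self, value):
        return value.encode("utf-8")

    def to_mysql(self, value):
        # Dispatch purely on the class name, e.g. 'dagruntype' -- str
        # subclasses do NOT fall back to _str_to_mysql.
        type_name = value.__class__.__name__.lower()
        try:
            return getattr(self, "_{0}_to_mysql".format(type_name))(value)
        except AttributeError:
            raise TypeError(
                "Python '{0}' cannot be converted to a MySQL type".format(type_name)
            )


class DagRunType(str, enum.Enum):
    """Like Airflow's run-type enum: subclasses str, but that doesn't help here."""
    BACKFILL_JOB = "backfill"


conv = MiniConverter()
assert conv.to_mysql("scheduled") == b"scheduled"   # plain str: fine
try:
    conv.to_mysql(DagRunType.BACKFILL_JOB)          # enum: no _dagruntype_to_mysql
except TypeError as err:
    print(err)  # Python 'dagruntype' cannot be converted to a MySQL type
```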

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 5 (2 by maintainers)

Top GitHub Comments

2 reactions
ashb commented, Nov 17, 2020

This may be a problem with SQLA+MySQL Connector. From SQLA’s mysql docs:

The MySQL Connector/Python DBAPI has had many issues since its release, some of which may remain unresolved, and the mysqlconnector dialect is not tested as part of SQLAlchemy’s continuous integration. The recommended MySQL dialects are mysqlclient and PyMySQL.

1 reaction
anujjamwal commented, Nov 17, 2020

Using mysql+mysqldb fixed the issue.
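In practice, the switch amounts to changing the SQLAlchemy dialect in the Airflow connection string from `mysql+mysqlconnector` to `mysql+mysqldb` (which uses the mysqlclient driver, so it must be installed). A sketch of the `airflow.cfg` change, with placeholder host and credentials:

```ini
[core]
# before: mysql-connector-python dialect, which hits the converter error above
# sql_alchemy_conn = mysql+mysqlconnector://user:pass@cloudsql-host:3306/airflow

# after: mysqlclient dialect (requires `pip install mysqlclient`)
sql_alchemy_conn = mysql+mysqldb://user:pass@cloudsql-host:3306/airflow
```

The same value can be supplied via the `AIRFLOW__CORE__SQL_ALCHEMY_CONN` environment variable instead of editing the file.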
