Invalid input syntax for the type timestamp
Hello, thank you very much for the nice tool. I am evaluating it for a data integration use case and am experiencing issues when initializing a new Postgres replica. The process fails for a table that contains nullable DATETIME fields.
I am using:
- pg_chameleon 2.0.6
- MySQL 5.6 (source)
- Postgres 9.6 (target)
The replication process starts without errors, but when it reaches said table, I receive the following error for about 90% of the rows, because the DATETIME column is not set (which is legal in my data model):
2018-05-17 15:01:17 MainProcess ERROR pg_lib.py (3414): error when inserting the row, skipping the row
2018-05-17 15:01:17 MainProcess WARNING pg_lib.py (3409): character mismatch when inserting the data, trying to cleanup the row data
Looking at the Postgres log, I could see the following error message:
ERROR: invalid input syntax for type timestamp: "None" at character 596
The character position points into an INSERT statement where the value is ‘None’ for a DATETIME field. I would expect psycopg2 to translate Python None into SQL NULL, but that does not seem to happen here. The error occurs whenever the MySQL DATETIME field of that table is NULL. Unless I am mistaken, the script inserts the string ‘None’ instead of NULL, which causes Postgres to complain.
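The conversion is easy to reproduce without a database. psycopg2 adapts Python None to SQL NULL, but once a value has gone through str() it becomes the literal string "None" — exactly what the cleanup list comprehension in the method below does:

```python
# Minimal reproduction of the None -> "None" conversion (no database needed).
# psycopg2 would render Python None as SQL NULL, but str(None) is "None".
row = (42, None)  # e.g. (id, nullable DATETIME)
cleaned = [str(item).replace("\x00", "") for item in row]
print(cleaned)  # ['42', 'None'] -- the NULL has become the string 'None'
```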
The method causing the error is the following, starting at line 3388 of pg_lib.py, where the ValueError is handled.
def insert_data(self, schema, table, insert_data, column_list):
    """
        The method is a fallback procedure for when the copy method fails.
        The procedure performs a row by row insert, very slow but capable to skip the rows with problematic data (e.g. encoding issues).

        :param schema: the schema name where table belongs
        :param table: the table name where the data should be inserted
        :param insert_data: a list of records extracted from the database using the unbuffered cursor
        :param column_list: the list of column names quoted for the inserts
    """
    sample_row = insert_data[0]
    column_marker = ','.join(['%s' for column in sample_row])
    sql_head = 'INSERT INTO "%s"."%s"(%s) VALUES (%s);' % (schema, table, column_list, column_marker)
    for data_row in insert_data:
        try:
            self.pgsql_cur.execute(sql_head, data_row)
        except psycopg2.Error as e:
            self.logger.error("SQLCODE: %s SQLERROR: %s" % (e.pgcode, e.pgerror))
            self.logger.error(self.pgsql_cur.mogrify(sql_head, data_row))
        except ValueError:
            self.logger.warning("character mismatch when inserting the data, trying to cleanup the row data")
            # str(item) turns None into the string "None", which Postgres
            # then rejects for timestamp columns
            data_row = [str(item).replace("\x00", "") for item in data_row]
            try:
                self.pgsql_cur.execute(sql_head, data_row)
            except:
                self.logger.error("error when inserting the row, skipping the row")
        except:
            self.logger.error("unexpected error when processing the row")
            self.logger.error(" - > Table: %s.%s" % (schema, table))
Any ideas if that is really the cause?
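For what it is worth, a None-preserving variant of the cleanup step might look like this (a sketch only; `clean_row` is a hypothetical helper, not part of pg_lib.py):

```python
def clean_row(data_row):
    """Strip NUL bytes from string values while keeping NULLs intact.

    Hypothetical replacement for the cleanup list comprehension in
    insert_data: only strings are sanitized, so None is passed through
    unchanged and psycopg2 can render it as SQL NULL.
    """
    return [
        item.replace("\x00", "") if isinstance(item, str) else item
        for item in data_row
    ]
```

With this, the retry could call `self.pgsql_cur.execute(sql_head, clean_row(data_row))` and NULL timestamps would stay NULL.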
Issue Analytics
- Created 5 years ago
- Comments: 5 (5 by maintainers)
The release 2.0.7 is out with the fix. Thanks for the feedback.
I think I’ve found the real issue and I’ve additionally fixed the same problem during the replica. The last commit should make it work properly.