
Error when importing huge yml/json (larger than 65KB) with a MySQL database: Data too long for column 'data' at row 1

See original GitHub issue

Hello,

I tried importing a large JSON API file (332 KB).
The operation failed with the following error:

org.jdbi.v3.core.statement.UnableToExecuteStatementException:
com.mysql.jdbc.MysqlDataTruncation: Data truncation: Data too long for column 'data' at row 1
[
  statement:"INSERT INTO api_content (design_id, type, data, created_by, created_on) VALUES (?, ?, ?, ?, ?)",
  rewritten:"INSERT INTO api_content (design_id, type, data, created_by, created_on) VALUES (?, ?, ?, ?, ?)",
  parsed:"ParsedSql{sql='INSERT INTO api_content (design_id, type, data, created_by, created_on) VALUES (?, ?, ?, ?, ?)',
  parameters=ParsedParameters{positional=true, parameterNames=[?, ?, ?, ?, ?]}}",
  arguments:{positional:{0:5,1:1,2:<stream object cannot be read for toString() calls>,3:n.lim@pyres.com,4:Mon Mar 11 11:38:40 UTC 2019},
  named:{},
  finder:[]}
]

I quickly looked into it.
Here is the structure of the api_content table:

mysql> DESCRIBE api_content;
+------------+--------------+------+-----+---------+----------------+
| Field      | Type         | Null | Key | Default | Extra          |
+------------+--------------+------+-----+---------+----------------+
| design_id  | bigint(20)   | NO   | MUL | NULL    |                |
| version    | bigint(20)   | NO   | PRI | NULL    | auto_increment |
| type       | tinyint(4)   | NO   | MUL | NULL    |                |
| data       | text         | NO   |     | NULL    |                |
| created_by | varchar(255) | NO   | MUL | NULL    |                |
| created_on | datetime     | NO   | MUL | NULL    |                |
| reverted   | tinyint(4)   | NO   | MUL | 0       |                |
| modifed_on | datetime     | YES  |     | NULL    |                |
+------------+--------------+------+-----+---------+----------------+

The data field is a TEXT column.
It turns out that TEXT columns are limited to 65,535 bytes (~64 KB), which is quite small.
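For context, each of MySQL's TEXT types has a fixed size ceiling. The following sketch (using a hypothetical scratch table, and assuming strict SQL mode, the default since MySQL 5.7) reproduces the truncation error:

```sql
-- Maximum stored size of each MySQL TEXT type:
--   TINYTEXT     255 bytes
--   TEXT         65,535 bytes        (~64 KB)  <- the current column type
--   MEDIUMTEXT   16,777,215 bytes    (~16 MB)
--   LONGTEXT     4,294,967,295 bytes (~4 GB)

-- Reproducing the error on a throwaway table:
CREATE TABLE scratch (data TEXT NOT NULL);
INSERT INTO scratch VALUES (REPEAT('x', 70000));
-- ERROR 1406 (22001): Data too long for column 'data' at row 1
```

With strict mode disabled, MySQL would instead silently truncate the value to 65,535 bytes and emit a warning, which is arguably worse for this use case.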

Luckily, it's easy to fix.
Changing the column to MEDIUMTEXT (16 MB limit) resolves the issue, and I was able to import the API spec successfully.

mysql> ALTER TABLE api_content MODIFY data MEDIUMTEXT;
Query OK, 25 rows affected (0.68 sec)
Records: 25  Duplicates: 0  Warnings: 0

mysql> DESCRIBE api_content;
+------------+--------------+------+-----+---------+----------------+
| Field      | Type         | Null | Key | Default | Extra          |
+------------+--------------+------+-----+---------+----------------+
| design_id  | bigint(20)   | NO   | MUL | NULL    |                |
| version    | bigint(20)   | NO   | PRI | NULL    | auto_increment |
| type       | tinyint(4)   | NO   | MUL | NULL    |                |
| data       | mediumtext   | YES  |     | NULL    |                |
| created_by | varchar(255) | NO   | MUL | NULL    |                |
| created_on | datetime     | NO   | MUL | NULL    |                |
| reverted   | tinyint(4)   | NO   | MUL | 0       |                |
| modifed_on | datetime     | YES  |     | NULL    |                |
+------------+--------------+------+-----+---------+----------------+
8 rows in set (0.00 sec)

Issue Analytics

  • State: closed
  • Created 5 years ago
  • Comments:5 (2 by maintainers)

Top GitHub Comments

1 reaction
Nicnl commented, Mar 11, 2019

You sir are an absolute madman
I’ve never seen in my life such a nice response and fast bug fix

Kudos

0 reactions
EricWittmann commented, Mar 11, 2019

Yes, there are different DDLs for each of the supported databases, fortunately. So I’m switching MySQL to use MEDIUMTEXT and leaving the others as they are. I did consider LONGTEXT, but it seems like overkill. I’ll wait for someone else to run into the 16MB limitation and complain, and deal with it then. 😃 I’m guessing it won’t happen - although I suppose if someone is programmatically generating OpenAPI documents it might be possible…
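For anyone running an existing MySQL installation who doesn't want to wait for a release, the fix amounts to a one-line migration. One detail worth noting from the DESCRIBE output earlier in this issue: MODIFY replaces the entire column definition, so the quick fix above silently dropped the NOT NULL constraint (data shows "YES" under Null afterwards). A sketch that preserves it:

```sql
-- MySQL only: widen data from TEXT (~64 KB) to MEDIUMTEXT (~16 MB).
-- Repeat NOT NULL so the column does not silently become nullable.
ALTER TABLE api_content MODIFY data MEDIUMTEXT NOT NULL;
```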


Top Results From Across the Web

What is the MySQL error: “Data too long for column”?
The “Data too long for column” error occurs when you insert more data for a column that does not have the capability to...
Read more >
mysql - "Data too long for column" - why? - Stack Overflow
When executing the INSERT command for Description I get an error: error 1406: Data too long for column 'Description' at row 1. All...
Read more >
Troubleshooting Row Size Too Large Errors with InnoDB
The original table schema shown earlier on this page causes the Row size too large error, because all of the table's VARCHAR columns...
Read more >
MySQL data too long error - DBA Stack Exchange
I had a column that was VARCHAR(30), and I used ALTER TABLE myTable MODIFY COLUMN myColumn VARCHAR(60) NOT NULL, to double its size....
Read more >
2 Server Error Message Reference - MySQL :: Developer Zone
Message: Result consisted of more than one row ... Message: Uncompressed data size too large; the maximum size is %d (probably, length of...
Read more >
