
TypeError: value is out of bounds when inserting 1M rows from a .sql file

See original GitHub issue
TypeError: value is out of bounds
    at checkInt (buffer.js:825:11)
    at Buffer.writeUInt16LE (buffer.js:883:5)
    at Packet.writeInt16 (/var/www/public/node_modules/mysql2/lib/packets/packet.js:527:15)
    at Packet.writeInt24 (/var/www/public/node_modules/mysql2/lib/packets/packet.js:523:8)
    at Packet.writeHeader (/var/www/public/node_modules/mysql2/lib/packets/packet.js:593:8)
    at Connection.writePacket (/var/www/public/node_modules/mysql2/lib/connection.js:142:10)
    at Query.start (/var/www/public/node_modules/mysql2/lib/commands/query.js:39:14)
    at Query.Command.execute (/var/www/public/node_modules/mysql2/lib/commands/command.js:34:20)
    at Connection.handlePacket (/var/www/public/node_modules/mysql2/lib/connection.js:310:28)
    at Connection.addCommand (/var/www/public/node_modules/mysql2/lib/connection.js:326:10)

Any ideas?

Issue Analytics

  • State: open
  • Created 8 years ago
  • Comments: 13 (6 by maintainers)

Top GitHub Comments

1 reaction
sidorares commented, Oct 1, 2016

Feel free to suggest a PR, but I'd rather add extra checks where required (for example, one source of the problem here is me trying to store (buffer length >> 16) as a byte value, which is out of range for lengths > 16M). So the correct code here should 1) check that the 3rd byte is < 256, and 2) actually implement the way MySQL handles big buffers.
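
The length-field problem described above can be sketched in a few lines of Node.js. This is a minimal illustration, not mysql2's actual code, and MAX_PACKET_LENGTH, writeHeaderNaive and splitIntoPackets are hypothetical names: the MySQL wire protocol stores each packet's payload length in 3 little-endian bytes, so writing (length >> 16) into a single byte overflows as soon as the payload exceeds 0xFFFFFF bytes, and larger payloads have to be split into multiple packets.

// Minimal sketch, not mysql2's actual implementation: a single MySQL packet
// can carry at most 0xFFFFFF (16,777,215) bytes of payload.
const MAX_PACKET_LENGTH = 0xffffff;

// Naive header write: throws a RangeError (reported as "value is out of
// bounds" in the Node version from the stack trace) once payloadLength >> 16
// no longer fits in one byte, i.e. for payloads of 2^24 bytes or more.
function writeHeaderNaive(payloadLength, sequenceId) {
  const header = Buffer.alloc(4);
  header.writeUInt16LE(payloadLength & 0xffff, 0);
  header.writeUInt8(payloadLength >> 16, 2); // throws when the value is >= 256
  header.writeUInt8(sequenceId, 3);
  return header;
}

// Protocol-compliant approach: split the payload into chunks of at most
// 0xFFFFFF bytes, each prefixed with its own 4-byte header and an
// incrementing sequence id; an exact multiple of the maximum ends with an
// empty trailing packet.
function* splitIntoPackets(payload, sequenceId = 0) {
  let offset = 0;
  let chunkLength;
  do {
    const chunk = payload.subarray(offset, offset + MAX_PACKET_LENGTH);
    chunkLength = chunk.length;
    const header = Buffer.alloc(4);
    header.writeUIntLE(chunkLength, 0, 3); // 3-byte little-endian length
    header.writeUInt8(sequenceId++ & 0xff, 3);
    yield Buffer.concat([header, chunk]);
    offset += MAX_PACKET_LENGTH;
  } while (chunkLength === MAX_PACKET_LENGTH);
}

For example, splitIntoPackets(Buffer.alloc(20 * 1024 * 1024)) would yield two packets: one carrying 16,777,215 bytes and one carrying the remaining 4,194,305 bytes.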

On the other hand, try/catch around user code (callbacks) is necessary.
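
The second point, exception safety around user callbacks, could look roughly like the sketch below (invokeUserCallback is a hypothetical helper, not part of mysql2's API): wrap the callback invocation in try/catch and re-throw asynchronously, so a throwing callback cannot interrupt the driver's own packet handling.

// Hypothetical helper illustrating the try/catch-around-callbacks idea;
// not mysql2's actual API.
function invokeUserCallback(callback, err, rows) {
  try {
    callback(err, rows);
  } catch (userError) {
    // Re-throw on the next tick so the error still surfaces as an uncaught
    // exception, without leaving the connection's command queue in a broken
    // state.
    process.nextTick(() => {
      throw userError;
    });
  }
}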

0 reactions
sidorares commented, Nov 15, 2016

Closing this (big packet support is in master, going to npm as 1.1.2 soon). Exception safety is tracked in issue #419.

Read more comments on GitHub >

Top Results From Across the Web

  • Get Error when Insert 1M rows from .sql file #1322 - GitHub
    Hi! I am trying node-mysql and node-mysql2, so it makes me confused, sorry about it. The code is run on nodejs v5.0.0, MYSQL...
  • Writing 1.5 million rows in a file using Python 3+ - Stack Overflow
    I suppose the code is heavily I/O-bound, and mostly waits for writes to complete. Several parallel processes could generate some more data...
  • Working With Line Numbers and Errors Using Bulk Insert
    In this blog post, we look at these techniques using T-SQL's native bulk insert (Line Numbers and Errors Using Bulk Insert).
  • Resolved Issues: Polarion ® ALM™ Platform - Siemens PLM
    DPP-209971, Index out of bounds error when comparing bigger table with merged cells, Issue. DPP-209483, Clustering: The configuration for location /polarion...
  • sql server - Generate and Insert 1 million rows into simple table
    Question: After researching, I found 4 solutions. Are there any better solutions (not using copy data from files)? sql...
