read ECONNRESET error is being thrown when trying to insert large size files
  • 30-May-2023
Author: Lightrun Team
Explanation of the problem


I am encountering an issue when attempting to insert large-sized files, resulting in an ECONNRESET error. To provide some context, I have established a database connection pool using the mysql library, and I am acquiring a connection using the pool.getConnection() method. The problem occurs during the following scenario:

  1. I receive a zip file via a web request.
  2. Next, I read the file data using the fs.readFile() method.
  3. Finally, I attempt to insert the file stream data into a table’s column with the longblob data type.

Interestingly, I have observed that when the zip file’s size remains below 2 MB to 3 MB, no issues arise. However, once the file size exceeds this threshold, the error occurs.

Troubleshooting with the Lightrun Developer Observability Platform


Getting a sense of what’s actually happening inside a live application is a frustrating experience, one that relies mostly on querying and observing whatever logs were written during development.
Lightrun is a Developer Observability Platform, allowing developers to add telemetry to live applications in real-time, on-demand, and right from the IDE.

  • Instantly add logs to, set metrics in, and take snapshots of live applications
  • Insights delivered straight to your IDE or CLI
  • Works where you do: dev, QA, staging, CI/CD, and production

Start for free today

Problem solution: “read ECONNRESET” error thrown when inserting large files

The ECONNRESET error occurs when inserting large files into a MySQL database because of the max_allowed_packet limit. By default, MySQL caps the size of a single packet exchanged between the client and the server; when an insert exceeds that cap, the connection is forcibly reset and the client sees ECONNRESET. To address this, raise max_allowed_packet in the MySQL server settings (my.cnf on Linux/macOS, my.ini on Windows). For example, setting max_allowed_packet=100M raises the limit to 100 megabytes. After modifying the configuration, restart the MySQL service so the change takes effect.
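A hedged example of the server-side change, assuming a my.cnf layout (use my.ini on Windows, and pick a limit that fits your largest files):

```ini
# my.cnf — raise the maximum client/server packet size
# 100M is an example value, not a recommended default
[mysqld]
max_allowed_packet=100M
```

After saving, restart the MySQL service (for example `sudo systemctl restart mysql` on systemd-based Linux) and confirm the new limit with `SHOW VARIABLES LIKE 'max_allowed_packet';`.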

Another aspect to consider when inserting large files is memory management and resource utilization. Reading and inserting an entire file in a single operation can exhaust system memory and lead to connection errors. To mitigate this, process the data in batches: a loopOverRequests function can iterate over the data and call a makeTheRequest function for each element, awaiting each request so it completes before the next iteration begins. By dividing the data into manageable chunks and processing them sequentially, you alleviate memory pressure and keep connections stable throughout the insertion process.
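The batching approach can be sketched as follows. This is a minimal illustration, not the article's original snippet: makeTheRequest here is a placeholder that just reports the chunk size, standing in for a real `connection.query('INSERT …', [chunk])` call.

```javascript
// Placeholder for the real insert; in production this would run an
// INSERT against the pooled MySQL connection and await the result.
async function makeTheRequest(chunk) {
  return chunk.length; // stand-in: report bytes "inserted"
}

// Walk the file buffer in fixed-size chunks, awaiting each request so
// only one chunk is in flight at a time.
async function loopOverRequests(buffer, chunkSize) {
  let inserted = 0;
  for (let offset = 0; offset < buffer.length; offset += chunkSize) {
    const chunk = buffer.slice(offset, offset + chunkSize);
    inserted += await makeTheRequest(chunk);
  }
  return inserted;
}
```

Because each iteration awaits the previous insert, at most one chunk is buffered in flight, which keeps memory usage bounded regardless of the file size.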

In summary, addressing the ECONNRESET error during large file insertions involves two measures: raising max_allowed_packet so the MySQL server accepts larger packets, and batching the work so requests arrive in smaller, manageable chunks. The first lets the server handle larger inserts without resetting the connection; the second improves memory management and helps prevent connection errors caused by excessive resource consumption. Evaluate the specific needs of your application and database setup when applying these solutions, considering the file sizes involved, the available system resources, and the network environment.

Problems with node-postgres


Problem: Authentication failure or access denied errors.

A common problem encountered while using node-postgres is receiving authentication failure or access denied errors when attempting to connect to the PostgreSQL database. These errors usually stem from incorrect credentials or insufficient permissions.


To address authentication failure or access denied errors, verify the correctness of the provided user credentials and password. Double-check the values of user and password used while creating the Pool object. Ensure that the specified user has the necessary privileges to access the specified database. If required, update the user credentials or grant appropriate permissions to resolve the authentication failure or access denied errors.

Problem: Inconsistent or unexpected behavior when executing queries.

Developers may sometimes encounter unexpected or inconsistent behavior when executing queries with node-postgres. This can include incorrect results, missing data, or inconsistent data retrieval.


When facing such issues, it is crucial to review the SQL queries and their parameters carefully. Pay attention to the syntax, table names, column names, and any filter conditions being used in the queries. Ensure that the queries are structured correctly and accurately reflect the desired operation. Furthermore, validate the data being passed as parameters to avoid any type mismatches or unexpected behavior. Debugging the queries and carefully examining the data flow can help identify and resolve any inconsistencies or unexpected behavior during query execution.
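One way to make queries easier to inspect is to build the query object separately from its execution, so the SQL text and bound parameters can be checked before they hit the database. The `users` table and `email` column below are illustrative names, not from the original article; node-postgres accepts this `{ text, values }` shape directly in `pool.query`.

```javascript
// Build a parameterized query object for node-postgres.
function buildUserQuery(email) {
  return {
    // $1 is bound to values[0]; the user input never becomes part of
    // the SQL text itself, which protects against SQL injection.
    text: 'SELECT id, email FROM users WHERE email = $1',
    values: [email],
  };
}

// With node-postgres installed:
// const { rows } = await pool.query(buildUserQuery('user@example.com'));
```

Logging the object returned by such a builder during debugging makes it easy to spot a wrong column name, a misplaced placeholder, or a parameter with an unexpected type.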


A brief introduction to node-postgres


Node-postgres is a widely used library for interacting with PostgreSQL databases in Node.js applications. It provides a comprehensive set of functionalities, allowing developers to establish connections, execute SQL queries, and perform database operations efficiently.

The library leverages the non-blocking I/O capabilities of Node.js to handle multiple concurrent database connections effectively. It offers a connection pool mechanism that manages a pool of reusable database connections, reducing the overhead of establishing a new connection for each query. The pool can be configured with various parameters such as the maximum number of connections, idle timeouts, and SSL support, providing flexibility and control over connection management.
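A pool configuration might look like the sketch below. Every value is an assumption to tune for your workload, not a recommended default; `max`, `idleTimeoutMillis`, `connectionTimeoutMillis`, and `ssl` are the standard node-postgres Pool options.

```javascript
// Illustrative node-postgres pool settings.
const poolConfig = {
  max: 10,                       // upper bound on open connections
  idleTimeoutMillis: 30000,      // release clients idle for 30 s
  connectionTimeoutMillis: 2000, // fail fast when the pool is exhausted
  ssl: false,                    // enable for managed/remote databases
};

// With the pg package installed:
// const { Pool } = require('pg');
// const pool = new Pool(poolConfig);
```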

Node-postgres supports executing both parameterized and raw SQL queries. With parameterized queries, developers can safely pass user input as query parameters, protecting against SQL injection attacks. The library also facilitates handling query results, including fetching rows, streaming result sets, and handling large object (LOB) data efficiently.

Furthermore, node-postgres supports advanced features like transaction management, listening for database notifications, and executing batch queries for improved performance. It also provides comprehensive error handling and event-based callbacks to handle database errors and asynchronous query execution.
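Transaction management with node-postgres follows the BEGIN/COMMIT/ROLLBACK pattern on a single checked-out client. The sketch below is illustrative: the `accounts` table and the transfer semantics are assumptions, and the pool is passed in as a parameter so the function can also be exercised with a stub.

```javascript
// Move `amount` between two rows atomically: both UPDATEs apply, or
// neither does.
async function transfer(pool, fromId, toId, amount) {
  const client = await pool.connect();
  try {
    await client.query('BEGIN');
    await client.query(
      'UPDATE accounts SET balance = balance - $1 WHERE id = $2',
      [amount, fromId]
    );
    await client.query(
      'UPDATE accounts SET balance = balance + $1 WHERE id = $2',
      [amount, toId]
    );
    await client.query('COMMIT');
  } catch (err) {
    await client.query('ROLLBACK'); // undo everything on any failure
    throw err;
  } finally {
    client.release(); // always return the client to the pool
  }
}
```

Note that the same client must be used for every statement in the transaction; running `BEGIN` through `pool.query` would execute each statement on a possibly different connection.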

In summary, node-postgres is a powerful and feature-rich library that enables seamless integration between Node.js applications and PostgreSQL databases. Its robust functionality, connection pooling capabilities, and support for advanced database operations make it a popular choice for developers working with Node.js and PostgreSQL.


Most popular use cases for node-postgres

  1. Connecting to PostgreSQL Databases: Node-postgres is primarily used for establishing connections with PostgreSQL databases from Node.js applications. It provides a comprehensive API to create connections, execute SQL queries, and interact with the database. Developers can use the library to establish connection pools, manage multiple concurrent connections efficiently, and execute parameterized or raw SQL queries.
  2. Advanced Database Operations: Node-postgres supports advanced database operations, making it suitable for complex application requirements. It provides features such as transaction management, allowing developers to execute multiple queries within a single transaction and ensure data consistency. The library also supports listening for database notifications, enabling real-time updates from the database. Additionally, node-postgres allows executing batch queries, which improves performance by reducing the overhead of individual query execution. These advanced capabilities empower developers to handle complex database scenarios effectively.
  3. Data Modeling and ORM Integration: Node-postgres can be used for data modeling and integration with Object-Relational Mapping (ORM) libraries. Developers can utilize node-postgres to create models and schemas, map them to database tables, and perform CRUD (Create, Read, Update, Delete) operations on the data. The library integrates well with popular Node.js ORM frameworks such as Sequelize or TypeORM, allowing developers to leverage the power of node-postgres while benefiting from the higher-level abstractions and convenience provided by ORM libraries.
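A minimal connection-and-query sketch for the first use case might look like the following. All connection parameters are assumptions for a hypothetical local database; replace them with your own.

```javascript
// Assumed credentials for a local PostgreSQL instance.
const poolConfig = {
  host: 'localhost',
  port: 5432,
  user: 'app_user',
  password: 'app_password',
  database: 'app_db',
};

// With the pg package installed:
// const { Pool } = require('pg');
// const pool = new Pool(poolConfig);
// const { rows } = await pool.query('SELECT NOW() AS now');
// console.log(rows[0].now);
// await pool.end();
```

If this connection fails with an authentication error, the credentials in `poolConfig` are the first thing to re-check, as described in the troubleshooting section above.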

It’s Really not that Complicated.

You can actually understand what’s going on inside your live applications. It’s a registration form away.

Get Lightrun

Let's Talk!

Looking for more information about Lightrun and debugging?
We’d love to hear from you!
Drop us a line and we’ll get back to you shortly.

By submitting this form, I agree to Lightrun’s Privacy Policy and Terms of Use.