
res.write not sending chunks until res.end() is called

See original GitHub issue

I’m using Node v0.12.0 with express (v4.4.1) and compression (v1.6.0).

I’m sending back about 80MB of dynamically generated data (not from the FS or a DB) in multiple res.write() calls. When I add the compression middleware (with no options passed), I don’t see any traffic (using Wireshark) from the Node server until res.end() is called. When res.end() is called, there is a sudden burst of chunked data.

But when the compression module is not used, I see chunked responses going out on the wire. The size of each of these chunked fragments seems to be between 10K and 16K (based on Wireshark).

The only header information I am setting happens before the res.write() calls and is:

res.setHeader('Content-Type', 'application/json');

Any reason why the data is getting buffered before the res.end() call?
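
For reference, a minimal sketch of the setup being described (the /data route and the generateChunk() helper are made up for illustration; the real data generation is not shown in the issue):

var express = require('express');
var compression = require('compression');

var app = express();
app.use(compression()); // no options passed, as described above

// hypothetical generator standing in for the real (in-memory) data source
function generateChunk(i) {
  return { seq: i, payload: new Array(100).join('x') };
}

app.get('/data', function (req, res) {
  res.setHeader('Content-Type', 'application/json');
  for (var i = 0; i < 100000; i++) {
    res.write(JSON.stringify(generateChunk(i))); // data is generated on the fly, not read from FS or DB
  }
  res.end(); // with compression() in place, traffic only appears on the wire here
});

app.listen(3000);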

Issue Analytics

  • State: closed
  • Created: 8 years ago
  • Reactions: 1
  • Comments: 13 (8 by maintainers)

Top GitHub Comments

8 reactions
bijoythomas commented, Oct 22, 2015

So I figured out what is happening. The same issue exists in the latest version of Node too. The way I am sending data is something like this:

for (var i = 0; i < some_large_number; i++) {
  populate_data_in_array(arr);
  if (arr.length === 500) {
    res.write(JSON.stringify(arr));
    res.flush();
    arr = [];
  }
}

// Finally, if there is anything left in the array, send it across too

A look at the zlib.js source (https://github.com/nodejs/node/blob/master/lib/zlib.js, line 448) shows that the flush() call is not synchronous. Instead, it sets up a ‘drain’ listener to execute later. But with my for-loop hogging the thread, that listener has no chance to run. In fact, the loop keeps adding ‘drain’ listeners, which explains why I was getting this warning: (node) warning: possible EventEmitter memory leak detected. 11 drain listeners added
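
The starvation effect is generic to Node’s event loop, not specific to zlib. A tiny made-up illustration, using setImmediate to stand in for the deferred ‘drain’ work:

var flushed = 0;

function fakeFlush() {
  // stand-in for zlib's flush(): the real thing waits for a 'drain' event,
  // here we simply defer the work to a later turn of the event loop
  setImmediate(function () { flushed++; });
}

for (var i = 0; i < 5; i++) {
  fakeFlush();
  console.log('inside loop, flushed =', flushed); // always 0: the loop never yields to the event loop
}

setImmediate(function () {
  console.log('after loop, flushed =', flushed); // 5: the deferred callbacks only ran once the loop finished
});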

I changed my code to let the listener execute by making use of the callback that zlib’s flush() takes, trampolining between data generation and sending. It looks something like this:

// iterCount, maxCount, arr and res are shared variables in the enclosing scope
function populateArray() {
    if (iterCount < maxCount) {
      arr.push(generate_some_data());
    }
    send();
}

function send() {
    if (iterCount === maxCount) {
      if (arr.length > 0) {
        res.write(JSON.stringify(arr));
      }
      res.end();
      return;
    }

    if (arr.length === 500) {
      res.write(JSON.stringify(arr));
      res.flush(function () { // <--------- callback to flush which, on invocation, resumes the array population
        arr = [];
        populateArray(++iterCount); // ++iterCount bumps the shared counter; populateArray ignores the argument
      });
    } else {
      populateArray(++iterCount);
    }
}

In order to get this to work, I had to change the flush implementation to take a callback

res.flush = function flush(cb) {
  if (stream) {
    stream.flush(opts.flush, cb);
  }
};

Where opts is the variable that captures the options passed into the compression function. The zlib flush implementation takes the flush mode, but I’m not aware of a practical use case for using multiple flush modes within a single compression run. Or, if there is one, the function could take an additional flush argument, but the user would have to make sure the right mode is passed in.
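
If such a use case did exist, one hypothetical way to extend the patched flush (names mirror the snippet above; this is not part of the actual compression API) is to accept an optional mode and fall back to the mode captured from the middleware options:

var zlib = require('zlib');

res.flush = function flush(cb, mode) {
  if (stream) {
    // prefer a caller-supplied flush mode, otherwise use the one from the middleware options
    stream.flush(mode !== undefined ? mode : opts.flush, cb);
  }
};

// e.g. force a full flush for one particular write
// ('resume' being whatever callback continues the data generation):
// res.flush(resume, zlib.Z_FULL_FLUSH);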

4 reactions
dougwilson commented, Oct 21, 2015

This is because of how gzip compression works: to get small sizes, gzip needs to accumulate the payload so it can do substring search and replacement.

If you want to stream, you can call res.flush() between writes, but the compression will be much less efficient.

An example can be found at the bottom of the readme.
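
For completeness, here is a minimal sketch in the spirit of that readme example (not the actual readme code; the /stream route and the timer are made up): generate data incrementally instead of in one tight loop, and call res.flush() after each write so the compressed bytes reach the client before res.end().

var express = require('express');
var compression = require('compression');

var app = express();
app.use(compression());

app.get('/stream', function (req, res) {
  res.setHeader('Content-Type', 'application/json');
  var i = 0;
  var timer = setInterval(function () {
    res.write(JSON.stringify({ seq: i++ }) + '\n');
    res.flush(); // push the buffered, compressed bytes onto the wire now
    if (i === 10) {
      clearInterval(timer);
      res.end();
    }
  }, 100);
});

app.listen(3000);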

Read more comments on GitHub >

Top Results From Across the Web

  • node.js - res.write not sending big data until res.end() is called ...
    res.write not sending big data until res.end() is called after res.write but don't want to end response because it is SSE connection.
  • Stream with Node.js doesn't work | by Jamie Munro - Medium
    Seems that I have to have res.end() after the last res.write() to be able to send the data to the browser. Actually, it...
  • Top 10 Most Common Node.js Developer Mistakes - Toptal
    Mistake #2: Invoking a Callback More Than Once ... Notice how there is a return statement every time “done” is called, up until...
  • HTTP | Node.js v18 API
    For efficiency reasons, Node.js normally buffers the request headers until request.end() is called or the first chunk of request data is written.
  • Anatomy of an HTTP Transaction | Node.js
    (Though it's probably best to send some kind of HTTP error response. ... To do this, there's a method called writeHead , which...
