
WriteToStream - Backpressure Issue?

See original GitHub issue

fast-csv version: 4.3.0

I have a function that writes the contents of an array of objects to a CSV file with a specific header order. For small arrays I haven't seen a problem, but I have one array with over 15,000 objects, and if I execute the code below, the CSV file only ends up with about 4,000 of them.

None of the error handling fires, so finding the issue is problematic.

Do you think this could be a back-pressure issue? I'm not convinced, but the silent failure makes me wonder.

const fs = require('fs');
const { writeToStream } = require('@fast-csv/format');

function writeCSV(filename, headers, data) {
  try {
    console.log('Data Contains number of items: ' + data.length);
    const ws = fs.createWriteStream(filename);
    writeToStream(ws, data, { headers: headers, alwaysWriteHeaders: true })
      .on('error', (err) => {
        console.log('Error: ' + err);
      });
  } catch (error) {
    console.log(error);
  }
}

The console shows there are over 15,000 items, as expected from the source, but the .csv file contains just under 4,000.
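
One way to rule out an early exit is to wrap the write in a Promise that resolves on the file stream's 'finish' event, so the caller can await the flush before moving on. A minimal sketch along those lines, keeping the same signature as above (the example call path is hypothetical):

const fs = require('fs');
const { writeToStream } = require('@fast-csv/format');

function writeCSV(filename, headers, data) {
  return new Promise((resolve, reject) => {
    const ws = fs.createWriteStream(filename);
    // 'finish' fires only after the file stream has flushed all buffered data to disk
    ws.on('finish', resolve);
    // Only file-stream errors are handled here; formatting errors are out of scope for this sketch
    ws.on('error', reject);
    writeToStream(ws, data, { headers: headers, alwaysWriteHeaders: true });
  });
}

// Caller waits for the flush, e.g.:
// await writeCSV('/tmp/out.csv', headers, data);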

Issue Analytics

  • State: closed
  • Created 3 years ago
  • Comments: 6 (3 by maintainers)

Top GitHub Comments

1 reaction
p-obrien commented, Jun 14, 2020

Good news: this issue seems to be related to my code not allowing enough time for the stream to fully flush to disk. I was also using VS Code to view the CSV files when testing on my laptop, and it looks like it, or another process such as antivirus software, occasionally prevented the file from fully flushing.

After altering my code structure the Lambda completes successfully. Apologies for the run-around.
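
The final code isn't shown in the thread, but the usual shape of that restructuring is to have the Lambda handler await the flush before returning, e.g. reusing a Promise-based writeCSV like the sketch above (the handler, output path and event shape here are illustrative):

exports.handler = async (event) => {
  // Awaiting keeps the Lambda alive until the file stream has flushed to disk
  await writeCSV('/tmp/report.csv', event.headers, event.records);
  return { statusCode: 200 };
};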

0 reactions
p-obrien commented, Jun 13, 2020

My Lambda has a timeout of 15 minutes and the workstation I can reproduce on has no timeout. I will try to step through with breakpoints and see if I can find anything.

Read more comments on GitHub >

Top Results From Across the Web

  • Backpressuring in Streams - Node.js (see the sketch after this list)
    There is a general problem that occurs during data handling called backpressure, which describes a buildup of data behind a buffer during data...
  • Backpressure With Observable Sequences | RxJS
    In the case of lossy backpressure, the pausable operator can be used to stop listening and then resume listening at a later time...
  • 'continue' event for Readable streams · Issue #111 - GitHub
    Answer: The chunk is pushed onto the Transform stream read buffer and is not lost. It will eventually be sent downstream and obey...
  • How to detect if stream has backpressure? : r/node - Reddit
    I'm learning about Node.js streams and often I see this backpressure problem. There are many articles on how to deal with backpressure.
  • functional-streams-for-scala/fs2 - Gitter
    There's a way to preserve streaminess there, but that's the quick and dirty version that stops streaming. Definitely will lose backpressure, unless you...
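
The handshake the first result describes boils down to this: write() returns false once the stream's internal buffer is full, and the 'drain' event signals that it has emptied and writing can resume. A generic sketch with plain fs (not fast-csv specific; the function and names are illustrative):

const fs = require('fs');

function writeMany(path, lines) {
  return new Promise((resolve, reject) => {
    const ws = fs.createWriteStream(path);
    let i = 0;
    const writeNext = () => {
      while (i < lines.length) {
        // write() returns false when the internal buffer is full
        if (!ws.write(lines[i++] + '\n')) {
          // Pause and resume only after the buffer has drained
          ws.once('drain', writeNext);
          return;
        }
      }
      ws.end(resolve); // flush any remaining data, then resolve
    };
    ws.on('error', reject);
    writeNext();
  });
}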
