WriteToStream - Backpressure Issue?
See original GitHub issue
fast-csv version: 4.3.0
I have a function that writes the contents of an array of objects to a CSV file with a specific header order. For small arrays I've not seen a problem, but I have one array with over 15,000 objects, and when I execute the code below the CSV file ends up with only about 4,000 of them.
None of the error handlers fire, which makes the issue hard to diagnose.
Do you think this could be a back-pressure issue? I'm not convinced, but the silent failure makes me wonder.
const fs = require('fs');
const { writeToStream } = require('@fast-csv/format');

function writeCSV(filename, headers, data) {
  try {
    console.log('Data contains number of items: ' + data.length);
    const ws = fs.createWriteStream(filename);
    writeToStream(ws, data, { headers: headers, alwaysWriteHeaders: true })
      .on('error', (err) => {
        console.log('Error: ' + err);
      });
  } catch (error) {
    console.log(error);
  }
}
The console shows there’s over 15,000 items as expected from the source but the .csv file contains just under 4,000.
Issue Analytics
- State:
- Created 3 years ago
- Comments: 6 (3 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Good news: this issue seems to be related to my code not allowing enough time for the stream to fully flush to disk. I was also using VS Code to view the CSV files when testing on my laptop, and it looks like it, or another process such as AV, occasionally kept the file from flushing out completely.
After restructuring my code the Lambda completes successfully; apologies for the runaround.
My Lambda has a timeout of 15 minutes and the workstation that I can reproduce on has no timeout. Will try to step through with breakpoints etc. and see if I can find anything.