EMFILE: too many open files
See original GitHub issue

I'm running a job every second that instantiates a class, like so:
const RedisService = require('../../services/redis-service');
...
const redis = new RedisService()
This job runs for a while and then eventually goes into a fail loop with the error

EMFILE: too many open files

and points to ...services/redis-service.js as the open that exceeded whatever ulimit -n must be on my machine. I know there are packages like graceful-fs that might be able to work around this, but is it happening because I'm using Bree incorrectly? Is there a proper way to close the files that a job opens when it is run by Bree?
Issue Analytics
- State:
- Created 3 years ago
- Comments: 10 (6 by maintainers)
Top Results From Across the Web
node and Error: EMFILE, too many open files - Stack Overflow
After a number of searches I found a work around for the "too many open files" problem: var requestBatches = {}; function batchingReadFile(filename,...
Read more >

EMFILE: too many open files, watch · Issue #923 - GitHub
i'm facing below issue while generating archive in xcode node:events:371 throw er; // Unhandled 'error' event ^ Error: EMFILE: too many open ...
Read more >

How to fix the: "EMFILE: too many open files, watch" error in ...
A quick guide to how I solved a confusing React Native error.
Read more >

Quick fix for "EMFILE: too many open files error" during ... - torrito
Use graceful-fs in your webpack.config.js: "Quick fix for "EMFILE: too many open files error" during webpack build" is published by torrito.
Read more >

Node js: Cannot start debugging: EMFILE: too many open files
Error: Error: EMFILE: too many open files, open 'd:\PrototypingQuick\VuejsApp1\VuejsApp1\node_modules\caniuse-lite\data\regions\AN.js'
Read more >
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
@alvin30595 I would suggest using the workerMessageHandler and doing the logging work on the main process. Worker threads and logging is pretty difficult.

Specifically, I believe the issue was that as Node worker threads exited they did not necessarily flush their file streams, so when the number of open file streams exceeded ulimit, the parent process crashed. I think.
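The suggestion above can be sketched as follows. This assumes Bree's workerMessageHandler option and that the handler receives an object with name and message fields; jobs post log lines to the parent via parentPort.postMessage instead of writing their own log files, so no worker-owned stream is left unflushed at exit. The handler is shown as a standalone function; wiring it into Bree is left as a comment since Bree itself is not loaded here.

```javascript
// Centralized logging in the main process: workers never open their own
// log-file streams, so nothing is left unflushed when a worker exits.
function workerMessageHandler({ name, message }) {
  const line = `[${name}] ${message}`;
  console.log(line);
  return line;
}

// Passed to Bree at construction (not executed in this sketch):
// const bree = new Bree({ jobs: [{ name: 'my-job', interval: '1s' }], workerMessageHandler });
// Inside a job, log with: parentPort.postMessage('something to log');
```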