Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might look while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

"Too many open files" in debug mode

See original GitHub issue

When running in debug mode, a call to df -h /tmp is made from https://github.com/airbnb/streamalert/blob/cf12209aa071cd4b2f85bfc5b77446bbb696d451/stream_alert/rule_processor/payload.py#L207

This may result in an exception being thrown:

[Errno 24] Too many open files: OSError
Traceback (most recent call last):
  File "/var/task/stream_alert/rule_processor/main.py", line 39, in handler
    StreamAlert(context).run(event)
  File "/var/task/stream_alert/rule_processor/handler.py", line 148, in run
    self._process_alerts(payload)
  File "/var/task/stream_alert/rule_processor/handler.py", line 356, in _process_alerts
    for record in payload.pre_parse():
  File "/var/task/stream_alert/rule_processor/payload.py", line 155, in pre_parse
    s3_file = self._get_object()
  File "/var/task/stream_alert/rule_processor/payload.py", line 245, in _get_object
    return self._download_object(region, bucket, key)
  File "/var/task/stream_alert/rule_processor/payload.py", line 206, in _download_object
    LOGGER.debug(os.popen('df -h /tmp | tail -1').read().strip())
OSError: [Errno 24] Too many open files
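For context, the failing line shells out through os.popen just to log how full /tmp is, and os.popen itself needs a free pipe descriptor and a child process to succeed. As a minimal sketch (the log_tmp_usage helper below is illustrative, not part of StreamAlert), the same information can be gathered with os.statvfs without spawning anything:

import logging
import os

LOGGER = logging.getLogger(__name__)


def log_tmp_usage(path='/tmp'):
    """Log how full a filesystem is without shelling out.

    os.statvfs() queries the filesystem directly, so unlike
    os.popen('df -h /tmp | tail -1') it needs no pipe or child
    process, and therefore no extra file descriptors.
    """
    stats = os.statvfs(path)
    total_mb = stats.f_blocks * stats.f_frsize / float(1024 ** 2)
    free_mb = stats.f_bavail * stats.f_frsize / float(1024 ** 2)
    LOGGER.debug('%s: %.1f MB free of %.1f MB total', path, free_mb, total_mb)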

This will only present itself if debug output is on, and it may be masking another issue on my end, but perhaps this case should be checked somehow (where you may have too many files in /tmp?). I also don’t know why my /tmp would have too many files in it, or whether I should clear it out somehow. Based on Lambda limits, the /tmp directory should be limited to 512 MB anyway ( https://docs.aws.amazon.com/lambda/latest/dg/limits.html ), and the maximum number of file descriptors should be 1024.
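Since the Lambda limits page puts the descriptor ceiling at 1024, one way to tell whether this error is really about open descriptors (rather than files sitting in /tmp) is to count the process's open descriptors directly. A hedged sketch, assuming the Lambda runtime exposes /proc the way any Linux host does (the open_fd_count helper is hypothetical, not existing StreamAlert code):

import os


def open_fd_count():
    """Return roughly how many file descriptors this process has open.

    /proc/self/fd holds one entry per open descriptor on Linux; listing
    it briefly opens one descriptor of its own, so the result is an
    approximation.
    """
    return len(os.listdir('/proc/self/fd'))

Logging this value before and after _download_object across several invocations of a warm container would show whether descriptors are leaking between runs rather than /tmp filling up.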

Issue Analytics

  • State: closed
  • Created: 6 years ago
  • Comments: 9 (9 by maintainers)

Top GitHub Comments

1 reaction
0xdabbad00 commented, Feb 9, 2018

I’m digging into this to find the root cause. I’ll update this as I figure things out, but I suspect the issue might be isolated to my environment.

0 reactions
ryandeivert commented, Mar 27, 2018

We’ve seen this internally and can confirm this is not fixed.

Read more comments on GitHub >

Top Results From Across the Web

Debugging the "Too many files open" issue - Stack Overflow
+1 this helped me to debug an android app with a bad usage of a Process. Runtime.exec() with system's ping. I had to...
Read more >
[Performance Debugging] : Root causing “Too many open files ...
As most of us already know, we see “Too many open files” error when the total number of open file descriptors crosses the...
Read more >
How to diagnose 'TOO MANY OPEN FILES' issues? - IBM
the problem is due to a configuration too small for the current needs. Sometimes as well it might be that the process is...
Read more >
How To Fix Sahi's "Too Many Open Files" Error - Red Crackle
You can do this by executing "lsof -p <process-id>" where <process-id> is Sahi's process id. We noticed that Sahi is opening multiple socket...
Read more >
Fixing the “Too many open files” Error in Linux - Baeldung
When working with Linux servers, we may encounter the “Too many open files” error. In this article, we'll go over what this error...
Read more >
