
Likely memory leak under high load in 3.1.0+

See original GitHub issue

Describe the bug
When using the Docker image of Enketo Express, RAM usage increases constantly under load. From a fresh start, RAM usage is about 600-800 MB, but after a few days it grows to several GB. Note: the growth depends on the number of requests received.

To Reproduce
Use a load tester such as siege or locust and hit the root URL (http://enketo-express/) at 3-4 requests/s, as sketched below. After 24 hours, the RAM used by the Enketo Express Docker container should be around 8 GB; after 3 days, ~24 GB.
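As an illustration only (not from the original report, which used siege and locust), a minimal Node.js load generator along the same lines; the hostname and request rate below are assumptions to adapt to your setup:

```js
// Minimal sketch of a load generator hitting the Enketo Express root URL
// at roughly 3 requests per second. The hostname is an assumption.
const http = require('http');

const TARGET = 'http://enketo-express/';

setInterval(() => {
    http.get(TARGET, (res) => res.resume()) // drain the body, ignore content
        .on('error', (err) => console.error(err.message));
}, 333); // ~3 requests/s
```

While it runs, docker stats on the Enketo Express container shows whether resident memory keeps climbing.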

Expected behavior
RAM usage should stay below a reasonable amount (e.g. 2 GB max).

Screenshots
None, but I can re-run my tests to capture some if needed.

Browser and OS (please complete the following information):

  • Docker version 20.10.14, build a224086
  • Ubuntu 20.04 LTS

Additional context
The Enketo Express Docker container sits behind NGINX, which serves as the reverse proxy.

Issue Analytics

  • State: closed
  • Created a year ago
  • Comments:8 (8 by maintainers)

Top GitHub Comments

1 reaction
noliveleger commented, Jun 27, 2022

Hello @lognaturel,

I finally tested 2.2.0, 2.8.1, 3.0.4, 3.0.5 and again 3.1.0 (just to be sure). The tests were exactly the same. I’ve used locust to hit / with about 3 requests/s for about 3 hours.

For all the versions except the last one (3.1.0), memory usage stayed around 800 MB. With 3.1.0, it increased up to 2 GB. So it seems that only 3.1.0 and up are affected.
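(Editorial aside, not part of the original comment.) One quick way to confirm whether the growth is in the V8 heap or only in the process's resident memory is to periodically log process.memoryUsage() from inside the Enketo Express process, for example:

```js
// Log a one-line memory summary once a minute. heapUsed growing without
// bound points at retained JavaScript objects; rss growing alone points
// at native/external allocations.
setInterval(() => {
    const { rss, heapTotal, heapUsed, external } = process.memoryUsage();
    const mb = (n) => (n / 1024 / 1024).toFixed(1);
    console.log(
        `rss=${mb(rss)}MB heapTotal=${mb(heapTotal)}MB ` +
        `heapUsed=${mb(heapUsed)}MB external=${mb(external)}MB`
    );
}, 60 * 1000);
```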

While digging a bit into the diff of 3.1.0, the async context caught my attention. I think the package used for the async context, express-cls-hooked, is involved somehow.

Please have a look at this comment on the GitHub repository of the package.

Moreover, express-cls-hooked uses cls-hooked. A user claimed that cls-hooked had a memory leak and pushed a PR to fix it.
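For context, here is a minimal editorial sketch (not Enketo Express's actual middleware) of the per-request continuation-local-storage pattern that cls-hooked and express-cls-hooked provide; the namespace name, key, and port are illustrative assumptions. If the namespace, or the async hooks behind it, keeps references to request-scoped values after the response ends, memory grows with traffic:

```js
// Illustrative per-request async context with cls-hooked (not Enketo's code).
const express = require('express');
const cls = require('cls-hooked');

// Hypothetical namespace name and key, for illustration only.
const ns = cls.createNamespace('request-context');
const app = express();

app.use((req, res, next) => {
    // Every request runs inside the namespace; values set here are visible
    // to downstream middleware and handlers on the same async chain.
    ns.run(() => {
        ns.set('requestId', Math.random().toString(36).slice(2));
        next();
    });
});

app.get('/', (req, res) => {
    res.send(`request ${ns.get('requestId')}`);
});

app.listen(8005); // port is an assumption
```

If contexts like these (or emitters bound to them) are never released, each request leaves a little memory behind, which would match the slow, load-dependent growth described in the issue.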

1 reaction
noliveleger commented, Jun 23, 2022

Hello @lognaturel,

Do you believe this is new behavior?

Actually, I've seen this with 3.1.0, so I decided to test with the latest release available just to be sure it was not related to that particular version. So I guess it is not new behaviour.

If you’ve been tracking these metrics before, what’s the last version you didn’t see this behavior with?

Unfortunately, I haven't. I discovered this issue during a load test (started two weeks ago) to find bottlenecks in our setup when facing that kind of load. In my tests, I used the Enketo root because it was easier to set up (i.e. no auth) and to see how Node.js would behave, but in a real case I would only use the root for health-check monitoring (so even if we can have 3-4 requests/s on API endpoints, the root would not receive that much traffic). I can retry my tests with another GET API endpoint instead to validate that it's not only happening from /.

Have you changed anything about your configuration?

I did not. My configuration is exactly the same. To be sure, my last load test only hammered Enketo, to avoid any side effects from other containers.


