
[BUG] memory leak caused by overmind during SSR when using createOvermindSSR

UPDATE 29 September 2020: We have discovered a memory leak.

On our server we use Node.js + Express + server-side rendering with Overmind. During SSR we have several initialize functions, and each one calls 1-3 effects.gql.queries.someGQLAction effects. Example:

export const initialize = withCatchDuringInitializeSSR(
  async ({ state, effects }: SSRConfig) => {
    const productsResponse = await effects.gql.queries.getProducts()

    const products = overmindNormalize(
      productsResponse.products,
      "id"
    );
    state.comet.products.data = products;
  },
  "comet"
);

This effect is based on the Overmind graphql addon (overmind-graphql).
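
For context, here is a minimal sketch of how such an effect is typically wired up with the addon. The query name, its fields, and the file layout are illustrative assumptions, not our actual code, and the exact factory signature may differ between addon versions:

// Illustrative sketch of wiring the overmind-graphql addon (assumed API).
// "getProducts" and its fields are placeholders.
import { graphql, gql } from 'overmind-graphql';

const queries = {
  getProducts: gql`
    query GetProducts {
      products {
        id
        name
      }
    }
  `,
};

export const effects = {
  // exposed to actions and SSR initializers as effects.gql.queries.getProducts()
  gql: graphql({ queries }),
};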

Once we deployed the code, we noticed the leak reflected in our memory charts:

[Screenshot: memory usage chart]

Investigating with the heap profiler over consecutive snapshots, we noticed that the memory is held by one function:

[Screenshot: heap profiler snapshot comparison]
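
As a side note for anyone repeating this kind of investigation: consecutive heap snapshots can also be captured from inside the Node process with the built-in v8 module and then compared in Chrome DevTools. This is a general sketch of that approach, not the exact procedure we used:

// General sketch: write consecutive heap snapshots from the running server,
// then open the .heapsnapshot files in Chrome DevTools and use the
// "Comparison" view to see which objects accumulate between requests.
// v8.writeHeapSnapshot() is available in Node.js 11.13.0 and later.
import { writeHeapSnapshot } from 'v8';

let counter = 0;

export function captureSnapshot(): string {
  // Returns the path of the snapshot file that was written.
  const file = writeHeapSnapshot(`ssr-leak-${counter++}.heapsnapshot`);
  console.log(`heap snapshot written to ${file}`);
  return file;
}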

It seems this function is the source of the leak, although we don't know exactly why. Here is the stack:

  1. our initialize.ts function with effect call
  2. overmind-graphql internals
  3. graphql-request/src/index.ts --> GraphQLClient.prototype.request
  4. graphql-request/node-modules/node-fetch/lib/index.js json()

json() looks like:

    json() {
      var _this2 = this;
      return consumeBody.call(this).then(function (buffer) { // <--- leak is caused by this function
        try {
          return JSON.parse(buffer.toString());
        } catch (err) {
          return Body.Promise.reject(...);
        }
      });
    },

overmind-graphql 5.0.2 uses an old version of graphql-request (1.8.2; latest is 3.1.0), which in turn uses cross-fetch 2.2.2 (latest 3.0.6), and cross-fetch uses node-fetch 2.1.2 (latest 2.6.1). I believe that if overmind-graphql updates graphql-request to the latest version, the problem may vanish. Perhaps this PR can simply be merged: https://github.com/cerebral/overmind/pull/447
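
As a quick sanity check of which versions are actually resolved in a given deployment, something like the sketch below (or npm ls node-fetch) can help. Note that a copy nested under cross-fetch/node_modules may differ from the hoisted one, and newer packages that define an exports map can block requiring their package.json, hence the try/catch:

// Sketch: print the versions of the transitive dependencies named above,
// as resolved from the project root. Run as a plain Node.js script, e.g.
// "node check-versions.js" (the filename is just an example).
// Nested copies will not show up here; use "npm ls node-fetch" for the full tree.
const packages = ['overmind-graphql', 'graphql-request', 'cross-fetch', 'node-fetch'];

for (const name of packages) {
  try {
    const { version } = require(`${name}/package.json`);
    console.log(`${name}@${version}`);
  } catch {
    console.log(`${name}: could not resolve (possibly blocked by an "exports" map)`);
  }
}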

UPDATE 1 October 2020: We discovered the source of the leak. It is Overmind itself, not the dependency. We are not sure why the heap profiler shows a function from node-fetch as the bottleneck; probably it is because the objects held in memory originate from that function.

Overmind places Symbols into objects each time we render a page on the server. If we supply config to createOvermindSSR, it adds Symbols there, and since config is a global object it holds on to those Symbols, and the Symbols hold references to objects that are no longer needed once the rendered page has been sent back to the client. The garbage collector can't free them up.
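
To illustrate the mechanism in isolation (a simplified model, not Overmind's actual internals): attaching a fresh Symbol-keyed entry to a shared, module-level object on every request keeps every per-request payload reachable forever.

// Simplified model of the leak, not Overmind's real code.
// A module-level object shared between requests...
const sharedConfig: Record<symbol, unknown> = {};

function renderPage(requestId: number) {
  // ...per-request data that should become garbage once the response is sent...
  const perRequestState = { requestId, data: new Array(100_000).fill('x') };

  // ...but a Symbol key written onto the shared object keeps it reachable,
  // so the garbage collector can never reclaim it.
  sharedConfig[Symbol(`request-${requestId}`)] = perRequestState;
}

// Heap usage climbs with every call and never comes back down.
for (let i = 0; i < 1000; i++) renderPage(i);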

It also happens if, say, I have a global constant:

export const PRODUCTS = { product_1: { array_1: [] }, product_2: { array_2: [] }  }

and then during SSR I place it into state:

state.products = PRODUCTS

Overmind then pollutes this global constant with Symbols on each server page render, which leads to the memory leak.
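
One possible mitigation (not necessarily the workaround I used) is to hand Overmind a fresh copy per request, so that any Symbols it attaches land on a throwaway clone instead of the shared constant:

// Possible mitigation sketch: clone the shared constant per request so the
// module-level PRODUCTS object is never tagged with per-request Symbols.
state.products = JSON.parse(JSON.stringify(PRODUCTS));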

If it is still not clear, @christianalfoni, I will try to provide a reproducible demo later once I have time.

UPDATE 1 October 2020:

I created a small repo reproducing the problem: https://github.com/SergeiKalachev/overmind-ssr-leak-demo
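
For reference, the reproduction boils down to roughly the following shape (a sketch that assumes Overmind's documented SSR API, i.e. createOvermindSSR plus hydrate; the actual demo repo may be organized differently):

// Sketch of the reproduction shape; "config" stands for the app's Overmind
// config and is assumed to have a "products" key in its state.
import express from 'express';
import { createOvermindSSR } from 'overmind';
import { config } from './overmind';

// Module-level constant, as in the example above.
const PRODUCTS = { product_1: { array_1: [] }, product_2: { array_2: [] } };

const app = express();

app.get('*', (_req, res) => {
  const overmind = createOvermindSSR(config);

  // Before the fix, this assignment tagged PRODUCTS with per-request Symbols.
  overmind.state.products = PRODUCTS;

  // hydrate() collects the mutations to replay on the client; returning them
  // on every request is enough to make the heap grow under load.
  res.json(overmind.hydrate());
});

app.listen(3000);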

Issue Analytics

  • State: closed
  • Created 3 years ago
  • Reactions: 4
  • Comments: 11 (7 by maintainers)

Top GitHub Comments

1 reaction
christianalfoni commented, Feb 19, 2021

Great! Thanks for reporting 😄

1 reaction
SergeiKalachev commented, Feb 19, 2021

@christianalfoni sorry for the late reply. I had a chance to test it. I upgraded to version 27.0.0 and removed my workaround for avoiding the memory leak. It seems the memory leak is no longer there:

[Screenshot: memory chart after upgrading to 27.0.0]

The chart looks quite smooth. It goes up slowly, but based on the data from previous days I think it will level off at some point. Thank you very much for the fix!
