
Memory leak with optimistic response

See original GitHub issue

When using mutations with an optimisticResponse, we found a significant increase in memory usage.

Intended outcome: No increase, or only a small increase, in memory usage.

Actual outcome: Huge increase in memory usage.

How to reproduce the issue: Given an application with multiple watched queries:

  • take a memory snapshot
  • perform a mutation with an optimistic response (it is sufficient to update a single field of some object; a minimal sketch follows below)
  • take another memory snapshot

Alternatively, use Chrome's memory profiling instead of taking separate snapshots; this has the advantage that we get call stacks for the allocations.
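For illustration, a minimal reproduction might look like the sketch below. The endpoint URI, the Item type, and the item/updateItem fields are hypothetical stand-ins; any mutation that updates a single field with an optimisticResponse, while one or more queries are being watched, should exhibit the behaviour described above.

```ts
import ApolloClient from 'apollo-client';
import { InMemoryCache } from 'apollo-cache-inmemory';
import { HttpLink } from 'apollo-link-http';
import gql from 'graphql-tag';

const client = new ApolloClient({
  cache: new InMemoryCache(),
  link: new HttpLink({ uri: '/graphql' }), // hypothetical endpoint
});

// One watched query; the real application has many of these.
const ITEM_QUERY = gql`
  query Item($id: ID!) {
    item(id: $id) {
      id
      title
    }
  }
`;
client.watchQuery({ query: ITEM_QUERY, variables: { id: '1' } }).subscribe(() => {});

const UPDATE_TITLE = gql`
  mutation UpdateTitle($id: ID!, $title: String!) {
    updateItem(id: $id, title: $title) {
      id
      title
    }
  }
`;

// Take a heap snapshot, run this mutation, then take another snapshot.
client.mutate({
  mutation: UPDATE_TITLE,
  variables: { id: '1', title: 'new title' },
  // Updating a single field with an optimisticResponse is enough to
  // trigger the memory increase described in this issue.
  optimisticResponse: {
    updateItem: { __typename: 'Item', id: '1', title: 'new title' },
  },
});
```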

Versions

  • apollo-cache-inmemory: 1.3.8
  • apollo-client: 2.4.5
  • apollo-link: 1.2.3
  • apollo-link-context: 1.0.9
  • apollo-link-error: 1.1.1
  • apollo-link-http: 1.5.5
  • react-apollo: 2.2.4

We suspect that this was introduced in #3394; in particular, commit 45c4169fa9bf0b4c61e3273254e851e05905a431 appears to be the main cause.

If InMemoryCache.optimistic contains an entry, read and diff create temporary stores with that optimistic response:
https://github.com/apollographql/apollo-client/blob/48a224d504487ea480bce95685402d40c4a90eea/packages/apollo-cache-inmemory/src/inMemoryCache.ts#L137-L139
https://github.com/apollographql/apollo-client/blob/48a224d504487ea480bce95685402d40c4a90eea/packages/apollo-cache-inmemory/src/inMemoryCache.ts#L167-L169

These temporary stores are then passed to readQueryFromStore/diffQueryAgainstStore and should later be garbage collected. Unfortunately, the store objects are used internally as keys in calls to cacheKeyRoot.lookup:
https://github.com/apollographql/apollo-client/blob/48a224d504487ea480bce95685402d40c4a90eea/packages/apollo-cache-inmemory/src/readFromStore.ts#L124-L130
https://github.com/apollographql/apollo-client/blob/48a224d504487ea480bce95685402d40c4a90eea/packages/apollo-cache-inmemory/src/readFromStore.ts#L144-L150

Commit 45c4169fa9bf0b4c61e3273254e851e05905a431 replaced the defaultMakeCacheKey function from optimism (which internally uses a WeakMap) with the newly introduced CacheKeyNode concept, so these temporary stores end up as keys in one of the (strong) Maps, effectively leaking them.
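As a simplified illustration of why the cache-key change matters (this is not the actual optimism/CacheKeyNode code), a cache keyed by object identity behaves very differently with a strong Map than with a WeakMap:

```ts
// Simplified illustration only; not the actual apollo-cache-inmemory code.
const strongCache = new Map<object, object>();   // retains every key forever
const weakCache = new WeakMap<object, object>(); // lets unreachable keys be collected

function readWithStrongCache(store: object): object {
  let result = strongCache.get(store);
  if (!result) {
    result = { /* expensive query result */ };
    strongCache.set(store, result); // the store object is now strongly retained
  }
  return result;
}

function readWithWeakCache(store: object): object {
  let result = weakCache.get(store);
  if (!result) {
    result = { /* expensive query result */ };
    weakCache.set(store, result); // collected once the store becomes unreachable
  }
  return result;
}

// Every optimistic read builds a fresh temporary store, so keying on it
// with a strong Map leaks one store (plus its cached result) per read:
for (let i = 0; i < 1000; i++) {
  const temporaryOptimisticStore = {}; // stands in for the merged optimistic store
  readWithStrongCache(temporaryOptimisticStore); // leaked
  readWithWeakCache(temporaryOptimisticStore);   // garbage-collectable
}
```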

The more watched queries you have, the bigger the problem, because maybeBroadcastWatch calls diff for every single watcher. So, for a single optimistic response, we end up creating (and leaking) a temporary store for every single watcher (apart from the leak, there is certainly some room for optimization here). A rough sketch of this amplification follows below.
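Roughly paraphrasing that amplification (the helper names below are hypothetical and only stand in for the real merge/diff logic):

```ts
// Hypothetical paraphrase of the broadcast path; not the actual InMemoryCache code.
interface Watch {
  query: object;
  callback: (diff: object) => void;
}

function broadcastWatches(watches: Watch[], optimisticData: object[]): void {
  for (const watch of watches) {
    // Each watcher gets its own diff, and while optimistic data is pending
    // each diff builds a fresh temporary store...
    const temporaryStore = mergeOptimisticData(optimisticData);
    // ...which then ends up as a strongly held cache key (see the sketch above),
    // so N watchers leak N temporary stores per optimistic response.
    watch.callback(diffAgainstStore(temporaryStore, watch.query));
  }
}

// Stand-ins for the real merge/diff helpers:
function mergeOptimisticData(optimisticData: object[]): object {
  return Object.assign({}, ...optimisticData);
}

function diffAgainstStore(store: object, query: object): object {
  return { store, query, complete: true };
}
```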

We are not sure whether this is the only problem or whether there are other leaks as well. We saw a single mutation (updating a single field) cause a memory increase of 40-80 MB, and we are not sure such a huge increase can be caused by this problem alone.

@benjamn Since you were the main author of #3394, could you take a look at this? Let me know if you need more information!

Issue Analytics

  • State: closed
  • Created: 5 years ago
  • Reactions: 18
  • Comments: 19 (6 by maintainers)

Top GitHub Comments

6 reactions
benjamn commented, Dec 19, 2018

Ok, apollo-cache-inmemory@1.3.12 has just been published to npm. Closing this issue now, perhaps somewhat optimistically, because I believe the memory leaks that were specifically related to optimistic responses have been addressed. Please open new issues for any other memory problems. Thanks!

2 reactions
barbalex commented, Dec 19, 2018

OK, tested with the optimistic response. Here are the results of clicking an option repeatedly while watching memory usage:

  • Without optimistic response: Memory rises, beginning at 70 MB. Stops rising and drops a little at 400 MB after about 20 clicks. After some inactivity, it falls back to 70 MB.
  • With optimistic response, without this update: Memory keeps rising, beginning at 250 MB. The app crashes at 2 GB after 80 clicks.
  • With optimistic response, with this update: Memory keeps rising, beginning at 70 MB. It does not seem to rise above 350 MB. After some inactivity, it falls back to 70 MB.

The difference between using an optimistic response and not using one may be accidental, as the numbers vary. But the update definitely seems to solve the crashing when using an optimistic response in my app.
