
I left it running for some hours and the heap kept increasing.

Each time I refresh, there is a new connection to the server, and each of the connections takes up a few percent of the machine's memory to handle.
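
A minimal sketch like the one below can help confirm whether sessions really pile up per refresh, by logging the live-connection count alongside heap usage. The server config and the `postConnect` / `doneConnect` hook names are assumptions taken from the library's README, not code from this issue:

```js
// Hedged sketch: count live connections and log heap growth alongside them.
// The NodeMediaServer config and the postConnect/doneConnect event names are
// assumptions based on the library's documented usage, not the reporter's app.
const NodeMediaServer = require('node-media-server');

const nms = new NodeMediaServer({
  rtmp: { port: 1935, chunk_size: 60000, gop_cache: true, ping: 30, ping_timeout: 60 },
  http: { port: 8000, allow_origin: '*' },
});

let liveConnections = 0;
nms.on('postConnect', (id, args) => { liveConnections += 1; });
nms.on('doneConnect', (id, args) => { liveConnections -= 1; });

// If heapUsed keeps climbing while liveConnections returns to a low number,
// sessions are being closed but something they allocated is never released.
setInterval(() => {
  const { heapUsed } = process.memoryUsage();
  console.log(`connections=${liveConnections} heapUsed=${(heapUsed / 1048576).toFixed(1)} MB`);
}, 60000);

nms.run();
```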



```
[17940:000001EA465B16D0]  8798291 ms: Mark-sweep (reduce) 4082.6 (4104.9) -> 4082.2 (4105.7) MB, 2971.9 / 0.0 ms  (average mu = 0.215, current mu = 0.116) allocation failure scavenge might not succeed
[17940:000001EA465B16D0]  8801265 ms: Mark-sweep (reduce) 4083.2 (4102.7) -> 4082.9 (4104.4) MB, 2889.3 / 0.0 ms  (average mu = 0.123, current mu = 0.028) allocation failure scavenge might not succeed


<--- JS stacktrace --->

FATAL ERROR: MarkCompactCollector: young object promotion failed Allocation failed - JavaScript heap out of memory
 1: 00007FF61A5C058F napi_wrap+109311
 2: 00007FF61A5652B6 v8::internal::OrderedHashTable<v8::internal::OrderedHashSet,1>::NumberOfElementsOffset+33302
 3: 00007FF61A566086 node::OnFatalError+294
 4: 00007FF61AE3153E v8::Isolate::ReportExternalAllocationLimitReached+94
 5: 00007FF61AE163BD v8::SharedArrayBuffer::Externalize+781
 6: 00007FF61ACC084C v8::internal::Heap::EphemeronKeyWriteBarrierFromCode+1516
 7: 00007FF61ACAB48B v8::internal::NativeContextInferrer::Infer+59243
 8: 00007FF61AC909BF v8::internal::MarkingWorklists::SwitchToContextSlow+57327
 9: 00007FF61ACA460B v8::internal::NativeContextInferrer::Infer+30955
10: 00007FF61AC9B72D v8::internal::MarkCompactCollector::EnsureSweepingCompleted+6269
11: 00007FF61ACA385E v8::internal::NativeContextInferrer::Infer+27454
12: 00007FF61ACA77EB v8::internal::NativeContextInferrer::Infer+43723
13: 00007FF61ACB1042 v8::internal::ItemParallelJob::Task::RunInternal+18
14: 00007FF61ACB0FD1 v8::internal::ItemParallelJob::Run+641
15: 00007FF61AC848D3 v8::internal::MarkingWorklists::SwitchToContextSlow+7939
16: 00007FF61AC9BBDC v8::internal::MarkCompactCollector::EnsureSweepingCompleted+7468
17: 00007FF61AC9A424 v8::internal::MarkCompactCollector::EnsureSweepingCompleted+1396
18: 00007FF61AC97F88 v8::internal::MarkingWorklists::SwitchToContextSlow+87480
19: 00007FF61ACC65D1 v8::internal::Heap::LeftTrimFixedArray+929
20: 00007FF61ACC86B5 v8::internal::Heap::PageFlagsAreConsistent+789
21: 00007FF61ACBD961 v8::internal::Heap::CollectGarbage+2033
22: 00007FF61ACBBB65 v8::internal::Heap::AllocateExternalBackingStore+1317
23: 00007FF61ACD5EE1 v8::internal::Factory::AllocateRawWithAllocationSite+193
24: 00007FF61ACDFC72 v8::internal::Factory::NewJSObjectFromMap+50
25: 00007FF61AB513ED v8::internal::JSObject::New+141
26: 00007FF61ADD6C29 v8::internal::Builtins::builtin_handle+289929
27: 00007FF61ADD58DD v8::internal::Builtins::builtin_handle+284989
28: 00007FF61ADD5563 v8::internal::Builtins::builtin_handle+284099
29: 00007FF61AEB9FCD v8::internal::SetupIsolateDelegate::SetupHeap+464173
30: 00007FF61AE4E7B1 v8::internal::SetupIsolateDelegate::SetupHeap+23825
31: 00007FF61AF18E60 v8::internal::SetupIsolateDelegate::SetupHeap+852928
32: 000000B52F3D0D1F
[nodemon] app crashed - waiting for file changes before starting...
```
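
The two Mark-sweep lines at the top of the log show the heap already sitting at roughly 4 GB with almost nothing reclaimed per full GC (the mutator utilisation `mu` falling toward 0), which is the pattern that ends in V8's "heap out of memory" abort. As a rough cross-check, and as a stopgap only, the configured ceiling can be inspected and raised with `--max-old-space-size`; with a genuine leak the process will simply hit the new limit later. A minimal sketch, assuming it runs in the same Node process as the server:

```js
// Sketch: report V8's configured heap ceiling and current usage so the
// ~4 GB figures in the crash log above can be related to the actual limit.
const v8 = require('v8');

const stats = v8.getHeapStatistics();
console.log(`heap limit: ${(stats.heap_size_limit / 1048576).toFixed(0)} MB`);
console.log(`heap used : ${(stats.used_heap_size / 1048576).toFixed(0)} MB`);

// Starting Node with e.g. `node --max-old-space-size=8192 app.js` raises the
// ceiling, but it only postpones this crash if connections keep leaking.
```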

Issue Analytics

  • State: open
  • Created: 2 years ago
  • Reactions: 5
  • Comments: 12 (4 by maintainers)

Top GitHub Comments

1 reaction
masterking32 commented, Oct 30, 2021

Hi, thank you for your response. I have already tested the latest version and version 2.1.9 (I have used this library for more than a year). Both have the same issue, but I think the latest version has more memory issues. Let me compare the changes from 2.1.8 to 2.1.9. If you want, I can test that too.


Update: OK, I checked the changes from version 2.1.8 to 2.1.9 and I don't think there is any significant update (there are two commits: https://github.com/illuspas/Node-Media-Server/commit/9eaa2c83075b2530168dbf2484be9cdfe5dbfeac and https://github.com/illuspas/Node-Media-Server/commit/dce482e8c1ae579d33d4623b20002283f364f3f0). The issue already existed in 2.1.9, but I think the memory leak is worse in version 2.3.8. @hthetiot
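
One way to compare two versions for a leak, independent of reading the commit diff, is to capture a heap snapshot under each version and diff the snapshots in Chrome DevTools. The trigger below (writing a snapshot once heap usage crosses a threshold) is an illustrative sketch only, not something from this thread; the 1 GB threshold is arbitrary:

```js
// Hedged sketch: write a heap snapshot once heap usage crosses a threshold,
// so snapshots taken under 2.1.8, 2.1.9 and 2.3.8 can be diffed in DevTools.
// v8.writeHeapSnapshot() requires Node >= 11.13.
const v8 = require('v8');

const THRESHOLD_BYTES = 1024 * 1024 * 1024; // hypothetical 1 GB trigger
let snapshotWritten = false;

setInterval(() => {
  const { heapUsed } = process.memoryUsage();
  if (!snapshotWritten && heapUsed > THRESHOLD_BYTES) {
    snapshotWritten = true;
    const file = v8.writeHeapSnapshot(); // blocks briefly while writing
    console.log(`heap snapshot written to ${file}`);
  }
}, 30000);
```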

1 reaction
hthetiot commented, Oct 29, 2021

Can you try version node-media-server@2.1.8 and see if the leak was introduced after that version? @masterking32
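
When switching between 2.1.8, 2.1.9 and 2.3.8 to bisect this, it is easy to end up testing a different build than intended (nodemon restarts, a stale node_modules). A small hedged check like the one below prints the version actually loaded at runtime; it assumes the package's package.json is requireable, which is the case for releases without a restrictive `exports` map:

```js
// Sketch: confirm which node-media-server release is actually running while
// bisecting the leak between 2.1.8, 2.1.9 and 2.3.8.
const { version } = require('node-media-server/package.json');
console.log(`node-media-server version in use: ${version}`);
```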

