Memory leak

Hello everyone. The following code results in a memory leak:
```js
import * as io from 'socket.io-client';
import Bluebird from 'bluebird';
import express from 'express';
import socketIo from 'socket.io';
import http from 'http';
import data from './socket.io.json';

describe('Socket.io', () => {
  it('200 thousand requests', async () => {
    const limit = 200 * 1000;

    // Add configuration for this test: Edit Configurations -> Node options -> --expose-gc (WebStorm)
    setInterval(() => {
      global.gc();
      console.error(new Date(), process.memoryUsage());
    }, 1000);

    // Server
    const app = express();
    const server = http.createServer(app);
    server.listen(20017, 'localhost');
    const ioMain = socketIo.listen(server);
    ioMain.sockets.on('connection', (socket) => {
      socket.on('some_route', async (args) => {
        return;
      });
    });

    // Client
    const socket = io.connect('ws://localhost:20017', {
      transports: ['websocket'],
      rejectUnauthorized: false,
      query: {key: 'key'}
    });
    await Bluebird.delay(3 * 1000);
    for (let i = 0; i < limit; i++) {
      socket.emit('some_route', ['some_data', 7777, data]);
    }
    await Bluebird.delay(3 * 1000);
  });
});
```
If you run this test with a limit of 200 thousand requests, you can see the following memoryUsage log:
```
2019-08-15T07:57:26.345Z { rss: 101449728,
  heapTotal: 69914624,
  heapUsed: 28566952,
  external: 31683 }
2019-08-15T07:57:27.345Z { rss: 91463680,
  heapTotal: 69914624,
  heapUsed: 27574720,
  external: 20968 }
2019-08-15T07:57:28.349Z { rss: 91475968,
  heapTotal: 69914624,
  heapUsed: 26643376,
  external: 20968 }
2019-08-15T07:57:34.580Z { rss: 1773096960,
  heapTotal: 921309184,
  heapUsed: 866143944,
  external: 819505496 }
```
Or, if you run this test with a limit of 800 thousand requests, the process crashes:
```
FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory

<--- Last few GCs --->
[5377:0x102802800]    13083 ms: Scavenge 1396.7 (1424.6) -> 1396.2 (1425.1) MB, 2.0 / 0.0 ms  (average mu = 0.155, current mu = 0.069) allocation failure
[5377:0x102802800]    13257 ms: Mark-sweep 1396.9 (1425.1) -> 1396.4 (1425.1) MB, 173.1 / 0.0 ms  (average mu = 0.093, current mu = 0.028) allocation failure scavenge might not succeed

<--- JS stacktrace --->
==== JS stack trace =========================================
    0: ExitFrame [pc: 0x3b4c160dbe3d]
Security context: 0x167f40a1e6e9 <JSObject>
    1: hasBinary [0x167f40c16b71] [/Users/denis/api/node_modules/has-binary2/index.js:~30] [pc=0x3b4c1617e245](this=0x167fb3f9ad81 <JSGlobal Object>,obj=0x167f2e2dd279 <Object map = 0x167f3307a4f1>)
    2: hasBinary [0x167f40c16b71] [/Users/denis/api/node_modules/has-binary2/index.js:~30] [pc=0x3b4c1617e0fa](this=0...

 1: 0x10003c597 node::Abort() [/usr/local/bin/node]
 2: 0x10003c7a1 node::OnFatalError(char const*, char const*) [/usr/local/bin/node]
 3: 0x1001ad575 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/usr/local/bin/node]
 4: 0x100579242 v8::internal::Heap::FatalProcessOutOfMemory(char const*) [/usr/local/bin/node]
 5: 0x10057bd15 v8::internal::Heap::CheckIneffectiveMarkCompact(unsigned long, double) [/usr/local/bin/node]
 6: 0x100577bbf v8::internal::Heap::PerformGarbageCollection(v8::internal::GarbageCollector, v8::GCCallbackFlags) [/usr/local/bin/node]
 7: 0x100575d94 v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [/usr/local/bin/node]
 8: 0x10058262c v8::internal::Heap::AllocateRawWithLigthRetry(int, v8::internal::AllocationSpace, v8::internal::AllocationAlignment) [/usr/local/bin/node]
 9: 0x1005826af v8::internal::Heap::AllocateRawWithRetryOrFail(int, v8::internal::AllocationSpace, v8::internal::AllocationAlignment) [/usr/local/bin/node]
10: 0x100551ff4 v8::internal::Factory::NewFillerObject(int, bool, v8::internal::AllocationSpace) [/usr/local/bin/node]
11: 0x1007da044 v8::internal::Runtime_AllocateInNewSpace(int, v8::internal::Object**, v8::internal::Isolate*) [/usr/local/bin/node]
12: 0x3b4c160dbe3d
13: 0x3b4c1617e245

Process finished with exit code 134 (interrupted by signal 6: SIGABRT)
```
The socket.io.json data can be found here: https://pastebin.com/uUeZJe6x
socket.io and socket.io-client version: 2.2.0
Issue Analytics
- State:
- Created: 4 years ago
- Reactions: 12
- Comments: 20 (4 by maintainers)
Top GitHub Comments
In my case, memory burst to 1.5 GB after 30K clients (a max of 3K active) connected and transmitted a total of 900K messages in 10 minutes, and the memory was not released when all clients disconnected (it remained at 1.4 GB even after calling the garbage collector manually). I tried to debug the memory leak in different ways and, after 4 days of debugging, found out that disabling perMessageDeflate fixes the issue (see the ws module API docs for this option). So the main question here is: why is perMessageDeflate true by default in Socket.IO?! Hope this helps others.
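The workaround described above can be sketched as follows. This is a hedged config sketch, not a verified fix for every setup: it assumes socket.io 2.x, where `perMessageDeflate` defaults to true and is forwarded to the underlying engine.io/ws server.

```js
// Sketch (socket.io 2.x assumed): create the server with the per-message
// deflate WebSocket extension disabled, so each compressed frame no longer
// allocates its own zlib context.
const http = require('http');
const socketIo = require('socket.io');

const server = http.createServer();
const ioMain = socketIo(server, {
  perMessageDeflate: false  // forwarded down to the ws server
});

server.listen(20017, 'localhost');
```

Note that disabling compression trades bandwidth for memory; for small JSON payloads over a LAN that is usually the right trade.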
For future readers, please note that the per-message deflate WebSocket extension is now disabled by default since version 3.0.0.
Documentation: https://socket.io/docs/v4/migrating-from-2-x-to-3-0/#Saner-default-values
I think we can now close this.