
Possible memory leak while sending a message with headers


Hi Guys

Environment Information

  • OS: CentOS (AWS), kernel 4.14.181-142.260.amzn2.x86_64
  • Node version: v10.11.0
  • NPM version: 6.4.1
  • C++ toolchain: g++ (GCC) 7.3.1 20180712 (Red Hat 7.3.1-9)
  • node-rdkafka version: 2.9.1

Steps to Reproduce

We have a system running on the EKS (AWS) platform. The Node process runs in a pod with cluster mode enabled. The system had been running for half a year without any memory problems; recently we started observing a memory leak. Per our code analysis, we managed to isolate the problem, which apparently occurs when we add a header to a Kafka message. That is the only code diff we have, and note that the problem is reproduced even with a null header.

To replicate the issue:

Create a simple Producer client and send a message with a header (the value doesn't matter; even null reproduces the memory leak). The memory grows very slowly but constantly; our scale is ~250 messages per second. We are using the Prometheus node exporter to collect the node statistics.
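For a quick in-process sanity check (an assumption about how one might watch the growth; the reporter used the Prometheus node exporter, not this), the process can log its own memory figures while the reproduction runs:

// Log RSS and heap usage every 30 seconds while the producer is sending.
// A steadily climbing rss with a roughly flat heapUsed suggests the growth
// is in native (librdkafka) allocations rather than the V8 heap.
setInterval(() => {
    const { rss, heapUsed, external } = process.memoryUsage();
    console.log(
        `rss=${(rss / 1048576).toFixed(1)}MB ` +
        `heapUsed=${(heapUsed / 1048576).toFixed(1)}MB ` +
        `external=${(external / 1048576).toFixed(1)}MB`
    );
}, 30000);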

[Screenshot: memory usage chart, 2020-10-04 12:28]

The chart shows a short period of time, but if you let it run for a few days, the memory doesn't stop growing.

[Screenshot: memory usage chart over a longer period, 2020-10-04 13:19]

node-rdkafka Configuration Settings

We have two MSK Kafka brokers with TLS connections.

Producer:

this.kproducer = new Kafka.Producer({
     'metadata.broker.list': this.producers.toString(),
     'message.send.max.retries': 10,
     'retry.backoff.ms': 1000,
     'compression.codec': 'snappy',
     'linger.ms': this.linger_ms,
     'security.protocol': 'ssl',
     'ssl.ca.location': 'kafka_ssl/ssl.ca',
     'socket.keepalive.enable': true,
     'dr_cb': process.env.DEBUG
});
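For context, produce() with a delivery callback configured (dr_cb) assumes the producer has been connected and is being polled; a minimal sketch of that wiring (the event handlers shown are illustrative, not taken from the original code) would be:

this.kproducer.connect();

this.kproducer.on('ready', () => {
    // Poll librdkafka regularly so delivery reports and errors get emitted.
    this.kproducer.setPollInterval(100);
});

this.kproducer.on('delivery-report', (err, report) => {
    if (err) {
        console.error('delivery failed', err);
    }
});

this.kproducer.on('event.error', (err) => {
    console.error('producer error', err);
});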

The code for sending a message

Without the memory leak:

return this.kproducer.produce(
    // Topic to send the message to
    this.successTopics,
    // optionally we can manually specify a partition for the message
    // this defaults to -1 - which will use librdkafka's default partitioner (consistent random for keyed messages, random for unkeyed messages)
    murmur.murmur2(event.id) % this.partitions,
    // Message to send. Must be a buffer
    Buffer.from(JSON.stringify(event)),
    // for keyed messages, we also specify the key - note that this field is optional
    "",
    // you can send a timestamp here. If your broker version supports it,
    // it will get added. Otherwise, we default to 0
    Date.now(),
    // you can send an opaque token here, which gets passed along
    // to your delivery reports
);

With the memory leak:

return this.kproducer.produce(
    // Topic to send the message to
    this.successTopics,
    // optionally we can manually specify a partition for the message
    // this defaults to -1 - which will use librdkafka's default partitioner (consistent random for keyed messages, random for unkeyed messages)
    murmur.murmur2(message.id) % this.partitions,
    // Message to send. Must be a buffer
    Buffer.from(JSON.stringify(event)),
    // for keyed messages, we also specify the key - note that this field is optional
    "",
    // you can send a timestamp here. If your broker version supports it,
    // it will get added. Otherwise, we default to 0
    Date.now(),
    // you can send an opaque token here, which gets passed along
    // to your delivery reports
    null,
    null, // note, the headers could be [{"header-name":"header-value"}]
);

I tried many experiments, sending different header values with different payload sizes; it seems to have no effect on the rate of memory growth. What matters is the mere fact that a header is sent at all.

I tried troubleshooting it using heap dump snapshots, although without much success.
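For what it's worth, one common way to take those snapshots (the heapdump npm package here is an assumption, not something mentioned in the issue) is to write a dump on demand and diff two of them in Chrome DevTools:

const heapdump = require('heapdump');

// Write a V8 heap snapshot; take one now and another a few hours later,
// then compare them in the Memory tab of Chrome DevTools.
// Note: allocations made by librdkafka's native code do not appear in a
// V8 heap snapshot, which may explain the limited success here.
heapdump.writeSnapshot(`./${Date.now()}.heapsnapshot`, (err, filename) => {
    if (err) console.error('snapshot failed', err);
    else console.log('snapshot written to', filename);
});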

Additional context

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 8 (2 by maintainers)

Top GitHub Comments

1 reaction
DenisVoloshin commented, Feb 23, 2021

Hi iradul, I'm still in the middle of running a long-duration test, although at first glance it seems like using undefined as the opaque value works. The test has been running for a day and I don't observe any significant memory rise. I think the issue can be closed. Thanks a lot for your assistance, much appreciated.
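For reference, the change being tested appears to be passing undefined instead of null for the opaque argument when headers are supplied; a sketch of the adjusted produce call (the header value is just an example) might look like:

return this.kproducer.produce(
    this.successTopics,
    murmur.murmur2(event.id) % this.partitions,
    Buffer.from(JSON.stringify(event)),
    "",
    Date.now(),
    // opaque: undefined (rather than null) is what reportedly avoids the leak
    undefined,
    // headers
    [{ "header-name": "header-value" }]
);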

0 reactions
stale[bot] commented, Jun 11, 2021

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.

