Memory leak in consumer when using lz4 compression

See original GitHub issue

I’ve noticed a problem with unbounded memory usage in the consumer when handling lz4-compressed payloads. The symptoms point to a memory leak of some description. I’ve created a test case which should allow easy reproduction of the issue.
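
For reference, a reproduction along these lines might look as follows. This is a sketch only: the broker address, topic name, and message counts are placeholders, not taken from the original report or its test case.

    # Fill a topic with lz4-compressed messages, then consume them while
    # watching resident memory. Broker and topic names are placeholders.
    # Requires an lz4 binding to be installed (that is the code path exercised).
    import os
    import resource
    from kafka import KafkaProducer, KafkaConsumer

    BROKER = 'localhost:9092'   # placeholder broker address
    TOPIC = 'lz4-leak-test'     # placeholder topic name

    producer = KafkaProducer(bootstrap_servers=BROKER, compression_type='lz4')
    for _ in range(200000):
        producer.send(TOPIC, os.urandom(1024))  # 1 KiB random payloads
    producer.flush()

    consumer = KafkaConsumer(TOPIC, bootstrap_servers=BROKER,
                             group_id='lz4-leak-check',
                             auto_offset_reset='earliest')
    for n, _ in enumerate(consumer, 1):
        if n % 10000 == 0:
            # ru_maxrss is reported in KiB on Linux
            rss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
            print('%d messages consumed, max RSS ~%d KiB' % (n, rss))
        if n >= 200000:
            break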

I’ve tried experimenting with various consumer settings such as fetch_max_bytes, max_partition_fetch_bytes, and max_in_flight_requests_per_connection in an attempt to reduce how much each individual fetch from Kafka needs to buffer. In all cases, memory usage of the consumer processes continued to rise until the host system killed them.
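
As a sketch of what that tuning looks like in kafka-python (the values below are illustrative only, not the ones actually tried in the report):

    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        'lz4-leak-test',                          # placeholder topic name
        bootstrap_servers='localhost:9092',       # placeholder broker address
        fetch_max_bytes=1 * 1024 * 1024,          # cap bytes per fetch response
        max_partition_fetch_bytes=256 * 1024,     # cap bytes per partition per fetch
        max_in_flight_requests_per_connection=1,  # one outstanding request at a time
    )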

I can additionally confirm that the issue is present in the latest PyPI release (1.3.2) as well as on master, and that it manifests only when using lz4 compression specifically; gzip, for example, appears to work fine.

Any help is greatly appreciated. Thanks!

Issue Analytics

  • State: closed
  • Created: 7 years ago
  • Reactions: 1
  • Comments: 13 (7 by maintainers)

Top GitHub Comments

2 reactions
dpkp commented, Mar 14, 2017

Given the complexity here w/ both options, I think I’m going to defer this until after the next release (which I’m hoping to push out in the next few days).

2 reactions
dpkp commented, Mar 9, 2017

We might consider switching to https://github.com/python-lz4/python-lz4 for primary lz4 support going forward. It seems to be more active.
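
For context, the library dpkp is referring to exposes an LZ4 frame API along these lines; a minimal round trip, assuming the lz4 package from that repository is installed:

    import os
    import lz4.frame

    payload = os.urandom(64 * 1024)
    compressed = lz4.frame.compress(payload)
    assert lz4.frame.decompress(compressed) == payload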

Read more comments on GitHub >

Top Results From Across the Web

LZ4 – Extremely fast compression - Hacker News
Strictly speaking, LZ4 data decompression (typically 3 GB/sec) is slower than memcpy (typically 12 GB/sec). But when using e.g. 128 CPU cores, ...
Read more >
lz4 decompression running out of memory
I am using the library github.com/bkaradzic/go-lz4 for lz4 decompression. When running the lz4 decode command, I get an out-of-memory error.
Read more >
Optimize application memory usage on Amazon ElastiCache ...
Most compression algorithms (such as Gzip, LZ4, Snappy) are not very effective at compressing small strings (such as a person's name or a URL)...
Read more >
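
That point is easy to demonstrate with the standard library's gzip module; the strings below are made up purely for illustration, and LZ4's frame overhead behaves similarly:

    import gzip

    small = b'Jane Doe'                        # a short, name-like string
    large = b'https://example.com/page ' * 1000

    print(len(small), len(gzip.compress(small)))  # output is larger than the input
    print(len(large), len(gzip.compress(large)))  # repetitive data shrinks a lot
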
Effects of Batch Size, Acknowledgments, and Compression on ...
The results show that any compression increases throughput. For this particular set of tests, throughput is increased by more than double for ...
Read more >
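
For reference, these are the kinds of producer knobs such a benchmark varies; a hedged kafka-python sketch with arbitrary values (the broker address is a placeholder):

    from kafka import KafkaProducer

    producer = KafkaProducer(
        bootstrap_servers='localhost:9092',  # placeholder broker address
        compression_type='lz4',              # or 'gzip', 'snappy', or None
        batch_size=64 * 1024,                # bytes batched per partition buffer
        linger_ms=10,                        # wait up to 10 ms to fill a batch
        acks=1,                              # leader-only acknowledgement
    )
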
OptimizeResources / Compression Memory Leak
During the process, it is necessary for me to compress this document. This is the code I am using: var oo = new...
Read more >
