Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking at while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

Possible worker memory leak (netty)

See original GitHub issue

Alluxio Version: 2.6.1

Describe the bug: Netty's memory leak detector (ResourceLeakDetector) found a leak in the worker.
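
The records below look like output from Netty's ResourceLeakDetector. As a minimal sketch (not part of the original report), this is how the detector is typically turned up so that every allocation records the access points seen in these traces; the same effect comes from the JVM flag -Dio.netty.leakDetection.level=paranoid appended to the worker's JVM options (e.g. ALLUXIO_WORKER_JAVA_OPTS; an assumed deployment detail, not stated in the issue):

    // Sketch: enable Netty's most verbose leak detection programmatically.
    // PARANOID tracks every allocation; ADVANCED samples allocations but still
    // records access points. Both produce "Recent access records" like below.
    import io.netty.util.ResourceLeakDetector;

    public class EnableLeakDetection {
      public static void main(String[] args) {
        ResourceLeakDetector.setLevel(ResourceLeakDetector.Level.PARANOID);
        System.out.println("Leak detection level: " + ResourceLeakDetector.getLevel());
      }
    }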

To Reproduce: the following leak records were observed in the worker log.

Recent access records:
Created at:
        io.netty.buffer.AbstractByteBufAllocator.compositeDirectBuffer(AbstractByteBufAllocator.java:223)
        io.netty.buffer.AbstractByteBufAllocator.compositeDirectBuffer(AbstractByteBufAllocator.java:218)
        io.netty.buffer.AbstractByteBufAllocator.compositeBuffer(AbstractByteBufAllocator.java:193)
        alluxio.grpc.GrpcSerializationUtils.getByteBufFromReadableBuffer(GrpcSerializationUtils.java:153)
        alluxio.grpc.WriteRequestMarshaller.deserialize(WriteRequestMarshaller.java:91)
        alluxio.grpc.WriteRequestMarshaller.deserialize(WriteRequestMarshaller.java:35)
        alluxio.grpc.DataMessageMarshaller.parse(DataMessageMarshaller.java:63)
        io.grpc.MethodDescriptor.parseRequest(MethodDescriptor.java:307)
        io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
        io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
        io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:765)
        io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
        io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
        java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        java.lang.Thread.run(Thread.java:748)
2021-08-26 01:41:56,381 ERROR ResourceLeakDetector - LEAK: ByteBuf.release() was not called before it's garbage-collected. See https://netty.io/wiki/reference-counted-objects.html for more information.
Recent access records:
Created at:
        io.netty.buffer.PooledByteBufAllocator.newDirectBuffer(PooledByteBufAllocator.java:385)
        io.netty.buffer.AbstractByteBufAllocator.directBuffer(AbstractByteBufAllocator.java:187)
        io.netty.buffer.AbstractByteBufAllocator.directBuffer(AbstractByteBufAllocator.java:178)
        io.netty.buffer.AbstractByteBufAllocator.buffer(AbstractByteBufAllocator.java:115)
        io.netty.handler.codec.ByteToMessageDecoder.expandCumulation(ByteToMessageDecoder.java:532)
        io.netty.handler.codec.ByteToMessageDecoder$1.cumulate(ByteToMessageDecoder.java:97)
        io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:274)
        io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
        io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
        io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
        io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
        io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
        io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
        io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
        io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:795)
        io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:475)
        io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:378)
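
The LEAK message above points at Netty's reference-counted objects page: a ByteBuf taken from an allocator starts with a reference count of 1 and must be released by whichever component ends up owning it, otherwise the backing direct memory is only noticed as leaked when the detector sees the wrapper get garbage-collected. A generic sketch of that ownership pattern (illustrative only, not Alluxio's WriteRequestMarshaller code):

    // Sketch: whoever owns a reference-counted ByteBuf must release it exactly once,
    // typically in a finally block; skipping the release produces traces like the ones above.
    import io.netty.buffer.ByteBuf;
    import io.netty.buffer.ByteBufAllocator;
    import io.netty.util.ReferenceCountUtil;

    public class ReleaseExample {
      static void consume(ByteBufAllocator alloc) {
        ByteBuf buf = alloc.directBuffer(256);   // refCnt == 1, caller owns the buffer
        try {
          // ... read from or write into the buffer ...
        } finally {
          ReferenceCountUtil.release(buf);       // decrements refCnt; frees direct memory at 0
        }
      }

      public static void main(String[] args) {
        consume(ByteBufAllocator.DEFAULT);
      }
    }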

Expected behavior: No leak

Urgency: Blocker

Additional context: none provided.

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 6 (6 by maintainers)

Top GitHub Comments

1 reaction
apc999 commented on Sep 7, 2021

@yuzhu can you check with @lilyzhoupeijie to hopefully close this ticket?

0 reactions
yuzhu commented on Sep 10, 2021

@lilyzhoupeijie closing this for now; if this resurfaces, please reopen the issue.


Top Results From Across the Web

  • A Netty ByteBuf Memory Leak Story and the Lessons Learned: We were chasing a memory leak while refactoring our log receiver. After a major refactoring, we noticed a gradual decrease of free memory...
  • Memory leaks when trying to get Netty server publish ...: HttpServerNetty. When I add the advanced leak detection I am seeing my worker threads are leaking memory when publishing messages to kafka.
  • Memory accumulated in netty PoolChunk - Stack Overflow: One instance of "io.netty.buffer.PoolChunk" loaded by "jdk.internal.loader.ClassLoaders$AppClassLoader @ 0x7f04ba5b0" occupies 16,793,832 (32.78 ... (A sketch of reading Netty's pooled-allocator metrics follows this list.)
  • Understanding Memory Leaks in Java - Baeldung: A Memory Leak is a situation where there are objects present in the heap that are no longer used, but the garbage collector...
  • Memory leak messages when shutting down Apache Tomcat ...: "This is very likely to create a memory leak." and "SEVERE: The web application [AM] created a ThreadLocal with key of type [...
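
As a follow-up to the PoolChunk result above, and purely as an assumption about how one might confirm such accumulation (not something discussed in the issue itself), Netty 4.1's pooled allocator exposes metrics that show whether direct arena usage keeps growing over time:

    // Sketch: read the default pooled allocator's metrics to watch direct memory usage.
    import io.netty.buffer.PooledByteBufAllocator;
    import io.netty.buffer.PooledByteBufAllocatorMetric;

    public class AllocatorMetrics {
      public static void main(String[] args) {
        PooledByteBufAllocatorMetric m = PooledByteBufAllocator.DEFAULT.metric();
        System.out.println("used direct memory: " + m.usedDirectMemory());
        System.out.println("used heap memory:   " + m.usedHeapMemory());
        System.out.println("direct arenas:      " + m.numDirectArenas());
        System.out.println("chunk size:         " + m.chunkSize());
      }
    }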
