
OutOfDirectMemoryError / large number of PooledUnsafeDirectByteBuf allocated

See original GitHub issue

We’re using Netty via Apache Camel (camel-netty4). The component acts as a server and is only supposed to read messages from clients. When starting our system, a number of clients (around 20-25) connect to the TCP server. During that process, the available direct memory is exhausted quickly.

[io.netty.util.internal.OutOfDirectMemoryError - failed to allocate 16777216 byte(s) of direct memory (used: 469762048, max: 477626368)]
io.netty.util.internal.OutOfDirectMemoryError: failed to allocate 16777216 byte(s) of direct memory (used: 469762048, max: 477626368)
	at io.netty.util.internal.PlatformDependent.incrementMemoryCounter(PlatformDependent.java:624)
	at io.netty.util.internal.PlatformDependent.allocateDirectNoCleaner(PlatformDependent.java:578)
	at io.netty.buffer.PoolArena$DirectArena.allocateDirect(PoolArena.java:686)
	at io.netty.buffer.PoolArena$DirectArena.newChunk(PoolArena.java:675)
	at io.netty.buffer.PoolArena.allocateNormal(PoolArena.java:237)
	at io.netty.buffer.PoolArena.allocate(PoolArena.java:221)
	at io.netty.buffer.PoolArena.allocate(PoolArena.java:141)
	at io.netty.buffer.PooledByteBufAllocator.newDirectBuffer(PooledByteBufAllocator.java:262)
	at io.netty.buffer.AbstractByteBufAllocator.directBuffer(AbstractByteBufAllocator.java:179)
	at io.netty.buffer.AbstractByteBufAllocator.directBuffer(AbstractByteBufAllocator.java:170)
	at io.netty.buffer.AbstractByteBufAllocator.ioBuffer(AbstractByteBufAllocator.java:131)
	at io.netty.channel.DefaultMaxMessagesRecvByteBufAllocator$MaxMessageHandle.allocate(DefaultMaxMessagesRecvByteBufAllocator.java:73)
	at io.netty.channel.socket.nio.NioDatagramChannel.doReadMessages(NioDatagramChannel.java:242)
	at io.netty.channel.nio.AbstractNioMessageChannel$NioMessageUnsafe.read(AbstractNioMessageChannel.java:75)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:610)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:551)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:465)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:437)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:873)
	at java.lang.Thread.run(Thread.java:745)

After tinkering with JVM and Netty parameters (e.g. “-XX:MaxDirectMemorySize=1G -Dio.netty.allocator.pageSize=8192 -Dio.netty.allocator.maxOrder=10”) the system now starts successfully. However, I am very sceptical about the amount of direct memory Netty allocates. A heap dump of the server application shows 63 (!) instances of PooledUnsafeDirectByteBuf held in memory. Even after extended no-load periods, none of those PooledUnsafeDirectByteBuf instances ever seem to be released.
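The figures in the exception line up with Netty's pooled-chunk arithmetic. A sketch of that arithmetic, assuming the 4.1.x defaults of pageSize=8192 and maxOrder=11 (the chunk size is pageSize << maxOrder):

```java
public class DefaultChunkMath {
    public static void main(String[] args) {
        // Netty's PooledByteBufAllocator reserves direct memory in whole
        // chunks of pageSize << maxOrder bytes; 8192 << 11 = 16 MiB
        // with the default settings.
        int chunkSize = 8192 << 11;
        System.out.println(chunkSize);              // 16777216 - the failed allocation size

        // The "used: 469762048" figure is exactly 28 such chunks
        // already reserved before the failing 29th allocation.
        System.out.println(469762048L / chunkSize); // 28
        System.out.println(469762048L % chunkSize); // 0
    }
}
```

So the allocator grows in 16 MiB steps regardless of how small each individual read buffer is, which is why direct memory can look exhausted well before the buffers themselves add up to the limit.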

Activating leak detection (“-Dio.netty.leakDetectionLevel=paranoid”) yielded nothing.

Am I missing something very obvious? Is this behaviour expected? What is the recommended way to cope with this?

Netty version

4.1.5

JVM version (e.g. java -version)

Java™ SE Runtime Environment (build 1.8.0_121-b13)
Java HotSpot™ 64-Bit Server VM (build 25.121-b13, mixed mode)

OS version (e.g. uname -a)

Linux *** 3.12.59-60.45-default #1 SMP Sat Jun 25 06:19:03 UTC 2016 (396c69d) x86_64 x86_64 x86_64 GNU/Linux

Issue Analytics

  • State: closed
  • Created: 7 years ago
  • Comments: 11 (5 by maintainers)

Top GitHub Comments

1 reaction
lkoe commented, Jan 16, 2018

@Viyond no real conclusion. In the end we tweaked some parameters to make the immediate exception go away in our environments.

We set these JVM params: -XX:MaxDirectMemorySize=1G -Dio.netty.allocator.pageSize=8192 -Dio.netty.allocator.maxOrder=10. In my (limited) understanding this results in 8 MB of reserved direct memory per allocated chunk, thus fitting 128 chunks into 1 GB of direct memory. This number was high enough to no longer trigger the exception in our environment. However, this is still all quite mysterious to me and in no way a recommendation of sorts.
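The reasoning above can be checked numerically; a sketch, assuming the chunk size is pageSize << maxOrder as with the default allocator:

```java
public class TunedChunkMath {
    public static void main(String[] args) {
        // With -Dio.netty.allocator.pageSize=8192 and -Dio.netty.allocator.maxOrder=10
        // each pooled chunk is 8192 << 10 bytes, i.e. 8 MiB instead of the 16 MiB default.
        int chunkSize = 8192 << 10;
        System.out.println(chunkSize);            // 8388608 bytes = 8 MiB

        // -XX:MaxDirectMemorySize=1G then leaves room for 128 such chunks.
        long maxDirect = 1L << 30;                // 1 GiB
        System.out.println(maxDirect / chunkSize); // 128
    }
}
```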

0 reactions
javisst commented, Mar 27, 2018

We had the same issue for months with the async MongoDB driver and SSL enabled. The issue only occurred during bulk operations on the MongoDB driver. The JVM settings

-XX:MaxDirectMemorySize=1G -Dio.netty.allocator.pageSize=8192 -Dio.netty.allocator.maxOrder=10

finally helped. Thank you very much!

I attached my stack trace. In case you’re interested I can provide you with a sample project.

2018-03-26T17:08:00.29+0200 [APP/PROC/WEB/0] OUT org.springframework.dao.InvalidDataAccessResourceUsageException: Unexpected exception; nested exception is com.mongodb.MongoInternalException: Unexpected exception
	at org.springframework.data.mongodb.core.MongoExceptionTranslator.translateExceptionIfPossible(MongoExceptionTranslator.java:91)
	at org.springframework.data.mongodb.core.ReactiveMongoTemplate.potentiallyConvertRuntimeException(ReactiveMongoTemplate.java:2210)
	at org.springframework.data.mongodb.core.ReactiveMongoTemplate.lambda$translateException$55(ReactiveMongoTemplate.java:2193)
	at reactor.core.publisher.Flux.lambda$onErrorMap$24(Flux.java:5420)
	at reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onError(FluxOnErrorResume.java:88)
	at reactor.core.publisher.FluxMap$MapSubscriber.onError(FluxMap.java:120)
	at com.mongodb.reactivestreams.client.internal.ObservableToPublisher$1.onError(ObservableToPublisher.java:73)
	at com.mongodb.async.client.AbstractSubscription.onError(AbstractSubscription.java:123)
	at com.mongodb.async.client.MongoIterableSubscription$1.onResult(MongoIterableSubscription.java:52)
	at com.mongodb.async.client.MongoIterableSubscription$1.onResult(MongoIterableSubscription.java:48)
	at com.mongodb.internal.async.ErrorHandlingResultCallback.onResult(ErrorHandlingResultCallback.java:49)
	at com.mongodb.async.client.AsyncOperationExecutorImpl$1$1.onResult(AsyncOperationExecutorImpl.java:70)
	at com.mongodb.internal.async.ErrorHandlingResultCallback.onResult(ErrorHandlingResultCallback.java:49)
	at com.mongodb.operation.FindOperation$3.onResult(FindOperation.java:819)
	at com.mongodb.operation.OperationHelper$ReferenceCountedReleasingWrappedCallback.onResult(OperationHelper.java:353)
	at com.mongodb.operation.CommandOperationHelper$1.onResult(CommandOperationHelper.java:385)
	at com.mongodb.internal.async.ErrorHandlingResultCallback.onResult(ErrorHandlingResultCallback.java:49)
	at com.mongodb.connection.DefaultServer$DefaultServerProtocolExecutor$2.onResult(DefaultServer.java:205)
	at com.mongodb.internal.async.ErrorHandlingResultCallback.onResult(ErrorHandlingResultCallback.java:49)
	at com.mongodb.connection.CommandProtocolImpl$1.onResult(CommandProtocolImpl.java:100)
	at com.mongodb.connection.DefaultConnectionPool$PooledConnection$1.onResult(DefaultConnectionPool.java:458)
	at com.mongodb.connection.UsageTrackingInternalConnection$2.onResult(UsageTrackingInternalConnection.java:110)
	at com.mongodb.internal.async.ErrorHandlingResultCallback.onResult(ErrorHandlingResultCallback.java:49)
	at com.mongodb.connection.InternalStreamConnection$2$1.onResult(InternalStreamConnection.java:364)
	at com.mongodb.connection.InternalStreamConnection$2$1.onResult(InternalStreamConnection.java:359)
	at com.mongodb.connection.InternalStreamConnection$MessageHeaderCallback$MessageCallback.onResult(InternalStreamConnection.java:628)
	at com.mongodb.connection.InternalStreamConnection$MessageHeaderCallback$MessageCallback.onResult(InternalStreamConnection.java:618)
	at com.mongodb.connection.InternalStreamConnection$5.failed(InternalStreamConnection.java:493)
	at com.mongodb.connection.netty.NettyStream.readAsync(NettyStream.java:232)
	at com.mongodb.connection.netty.NettyStream.handleReadResponse(NettyStream.java:266)
	at com.mongodb.connection.netty.NettyStream.access$600(NettyStream.java:66)
	at com.mongodb.connection.netty.NettyStream$InboundBufferHandler.exceptionCaught(NettyStream.java:333)
	at io.netty.channel.AbstractChannelHandlerContext.invokeExceptionCaught(AbstractChannelHandlerContext.java:285)
	at io.netty.channel.AbstractChannelHandlerContext.invokeExceptionCaught(AbstractChannelHandlerContext.java:264)
	at io.netty.channel.AbstractChannelHandlerContext.fireExceptionCaught(AbstractChannelHandlerContext.java:256)
	at io.netty.handler.ssl.SslHandler.exceptionCaught(SslHandler.java:1030)
	at io.netty.channel.AbstractChannelHandlerContext.invokeExceptionCaught(AbstractChannelHandlerContext.java:285)
	at io.netty.channel.AbstractChannelHandlerContext.invokeExceptionCaught(AbstractChannelHandlerContext.java:264)
	at io.netty.channel.AbstractChannelHandlerContext.fireExceptionCaught(AbstractChannelHandlerContext.java:256)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.exceptionCaught(DefaultChannelPipeline.java:1381)
	at io.netty.channel.AbstractChannelHandlerContext.invokeExceptionCaught(AbstractChannelHandlerContext.java:285)
	at io.netty.channel.AbstractChannelHandlerContext.invokeExceptionCaught(AbstractChannelHandlerContext.java:264)
	at io.netty.channel.DefaultChannelPipeline.fireExceptionCaught(DefaultChannelPipeline.java:933)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.handleReadException(AbstractNioByteChannel.java:112)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:157)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:886)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.lang.Thread.run(Thread.java:748)
Caused by: com.mongodb.MongoInternalException: Unexpected exception
	at com.mongodb.connection.InternalStreamConnection.translateReadException(InternalStreamConnection.java:542)
	at com.mongodb.connection.InternalStreamConnection.access$1300(InternalStreamConnection.java:74)
	... 25 common frames omitted
Caused by: io.netty.util.internal.OutOfDirectMemoryError: failed to allocate 65536 byte(s) of direct memory (used: 10438830, max: 10485760)
	at io.netty.util.internal.PlatformDependent.incrementMemoryCounter(PlatformDependent.java:640)
	at io.netty.util.internal.PlatformDependent.allocateDirectNoCleaner(PlatformDependent.java:594)
	at io.netty.buffer.UnpooledUnsafeNoCleanerDirectByteBuf.allocateDirect(UnpooledUnsafeNoCleanerDirectByteBuf.java:30)
	at io.netty.buffer.UnpooledUnsafeDirectByteBuf.<init>(UnpooledUnsafeDirectByteBuf.java:68)
	at io.netty.buffer.UnpooledUnsafeNoCleanerDirectByteBuf.<init>(UnpooledUnsafeNoCleanerDirectByteBuf.java:25)
	at io.netty.buffer.UnsafeByteBufUtil.newUnsafeDirectByteBuf(UnsafeByteBufUtil.java:595)
	at io.netty.buffer.PooledByteBufAllocator.newDirectBuffer(PooledByteBufAllocator.java:327)
	at io.netty.buffer.AbstractByteBufAllocator.directBuffer(AbstractByteBufAllocator.java:185)
	at io.netty.buffer.AbstractByteBufAllocator.directBuffer(AbstractByteBufAllocator.java:176)
	at io.netty.buffer.AbstractByteBufAllocator.ioBuffer(AbstractByteBufAllocator.java:137)
	at io.netty.channel.DefaultMaxMessagesRecvByteBufAllocator$MaxMessageHandle.allocate(DefaultMaxMessagesRecvByteBufAllocator.java:114)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:130)
	... 7 common frames omitted

Netty version 4.1.22.Final
