OutOfDirectMemoryError for large uploads using HttpPostMultipartRequestDecoder


Expected behavior

With an HttpDataFactory created with useDisk=true, I expected that files of any size could be uploaded.
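
For context, DefaultHttpDataFactory supports an always-on-disk mode (the boolean constructor used in the test below) and a mixed mode with a size threshold; a minimal sketch of both configurations, with an illustrative threshold value:

import io.netty.handler.codec.http.multipart.DefaultHttpDataFactory;
import io.netty.handler.codec.http.multipart.HttpDataFactory;

public class FactoryModes {
    public static void main(String[] args) {
        // useDisk=true: every upload is backed by a temporary file on disk
        HttpDataFactory diskFactory = new DefaultHttpDataFactory(true);
        // minSize: mixed mode - kept in memory below the threshold, moved to disk above it
        HttpDataFactory mixedFactory = new DefaultHttpDataFactory(16 * 1024);
    }
}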

Actual behavior

Example error from the unit test below:

io.netty.util.internal.OutOfDirectMemoryError: failed to allocate 4194304 byte(s) of direct memory (used: 62914560, max: 64487424)
	at io.netty.util.internal.PlatformDependent.incrementMemoryCounter(PlatformDependent.java:775)
	at io.netty.util.internal.PlatformDependent.reallocateDirectNoCleaner(PlatformDependent.java:748)
	at io.netty.buffer.UnpooledUnsafeNoCleanerDirectByteBuf.reallocateDirect(UnpooledUnsafeNoCleanerDirectByteBuf.java:34)
	at io.netty.buffer.UnpooledByteBufAllocator$InstrumentedUnpooledUnsafeNoCleanerDirectByteBuf.reallocateDirect(UnpooledByteBufAllocator.java:194)
	at io.netty.buffer.UnpooledUnsafeNoCleanerDirectByteBuf.capacity(UnpooledUnsafeNoCleanerDirectByteBuf.java:52)
	at io.netty.buffer.AbstractByteBuf.ensureWritable0(AbstractByteBuf.java:307)
	at io.netty.buffer.AbstractByteBuf.ensureWritable(AbstractByteBuf.java:282)
	at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:1105)
	at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:1098)
	at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:1089)
	at io.netty.handler.codec.http.multipart.HttpPostMultipartRequestDecoder.offer(HttpPostMultipartRequestDecoder.java:351)
	at NettyUploadTest.itCanProcessLargeFiles(NettyUploadTest.java:46)
// snip
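
Reading the trace: offer() appends each incoming chunk to the decoder's internal buffer (undecodedChunk in the Netty source) via writeBytes, so direct-memory use grows with the total body size rather than the chunk size. A simplified model of that accumulation, illustrative only and not Netty's actual code:

import io.netty.buffer.ByteBuf;
import io.netty.buffer.Unpooled;

public class AccumulationModel {
    public static void main(String[] args) {
        // one ever-growing direct buffer, standing in for the decoder's undecodedChunk
        ByteBuf undecodedChunk = Unpooled.directBuffer();
        byte[] chunk = new byte[1_000_000];
        for (int i = 0; i < 100; i++) {
            // each offered chunk is appended before parsing; ensureWritable keeps
            // reallocating a larger direct buffer, which is where the error is thrown
            undecodedChunk.writeBytes(chunk);
        }
        System.out.println("held in one buffer: " + undecodedChunk.readableBytes() + " bytes");
        undecodedChunk.release();
    }
}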

Steps to reproduce

Set -Xmx64m and run the unit test below. (Without an explicit -XX:MaxDirectMemorySize, the direct-memory limit follows the heap maximum, which matches the max: 64487424 reported in the error above.)

Minimal yet complete reproducer code (or URL to code)

import io.netty.buffer.ByteBuf;
import io.netty.buffer.Unpooled;
import io.netty.handler.codec.http.*;
import io.netty.handler.codec.http.multipart.DefaultHttpDataFactory;
import io.netty.handler.codec.http.multipart.FileUpload;
import io.netty.handler.codec.http.multipart.HttpDataFactory;
import io.netty.handler.codec.http.multipart.HttpPostMultipartRequestDecoder;
import org.junit.Test;

import java.nio.charset.StandardCharsets;
import java.util.Arrays;

import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.is;

public class NettyUploadTest {

    @Test
    public void itCanProcessLargeFiles() throws Exception {

        int fileSize = 100_000_000; // set Xmx to a number lower than this and it crashes
        int bytesPerChunk = 1_000_000;

        String prefix = "--861fbeab-cd20-470c-9609-d40a0f704466\n" +
            "Content-Disposition: form-data; name=\"image\"; filename=\"guangzhou.jpeg\"\n" +
            "Content-Type: image/jpeg\n" +
            "Content-Length: " + fileSize + "\n" +
            "\n";

        String suffix = "\n" +
            "--861fbeab-cd20-470c-9609-d40a0f704466--\n";

        HttpRequest request = new DefaultHttpRequest(HttpVersion.HTTP_1_1, HttpMethod.POST, "/upload");
        request.headers().set("content-type", "multipart/form-data; boundary=861fbeab-cd20-470c-9609-d40a0f704466");
        request.headers().set("content-length", prefix.length() + fileSize + suffix.length());

        HttpDataFactory factory = new DefaultHttpDataFactory(true); // useDisk=true: uploads should go to disk
        HttpPostMultipartRequestDecoder decoder = new HttpPostMultipartRequestDecoder(factory, request);
        decoder.offer(new DefaultHttpContent(Unpooled.wrappedBuffer(prefix.getBytes(StandardCharsets.UTF_8))));

        byte[] body = new byte[bytesPerChunk];
        Arrays.fill(body, (byte)1);
        for (int i = 0; i < fileSize / bytesPerChunk; i++) {
            ByteBuf content = Unpooled.wrappedBuffer(body, 0, bytesPerChunk);
            decoder.offer(new DefaultHttpContent(content)); // **OutOfMemory here**
            content.release();
        }

        decoder.offer(new DefaultHttpContent(Unpooled.wrappedBuffer(suffix.getBytes(StandardCharsets.UTF_8))));
        decoder.offer(new DefaultLastHttpContent());
        FileUpload data = (FileUpload) decoder.getBodyHttpDatas().get(0);
        assertThat((int)data.length(), is(fileSize));
        assertThat(data.get().length, is(fileSize));

        factory.cleanAllHttpData();

    }

}
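
For comparison, the usual streaming pattern in a real server looks like the sketch below; this is a hypothetical handler (UploadHandler is my name, not from the issue), assuming Netty 4.1's public multipart API, and the offer() call is the same one that runs out of direct memory in the test above:

import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.SimpleChannelInboundHandler;
import io.netty.handler.codec.http.HttpContent;
import io.netty.handler.codec.http.HttpObject;
import io.netty.handler.codec.http.HttpRequest;
import io.netty.handler.codec.http.LastHttpContent;
import io.netty.handler.codec.http.multipart.*;

public class UploadHandler extends SimpleChannelInboundHandler<HttpObject> {
    private final HttpDataFactory factory = new DefaultHttpDataFactory(true); // disk-backed
    private HttpPostRequestDecoder decoder;

    @Override
    protected void channelRead0(ChannelHandlerContext ctx, HttpObject msg) throws Exception {
        if (msg instanceof HttpRequest) {
            decoder = new HttpPostRequestDecoder(factory, (HttpRequest) msg);
        }
        if (decoder != null && msg instanceof HttpContent) {
            decoder.offer((HttpContent) msg); // same call that fails in the test above
            if (msg instanceof LastHttpContent) {
                for (InterfaceHttpData data : decoder.getBodyHttpDatas()) {
                    if (data.getHttpDataType() == InterfaceHttpData.HttpDataType.FileUpload) {
                        FileUpload upload = (FileUpload) data;
                        // with useDisk=true the bytes should already be in a temp file
                        System.out.println("stored at " + upload.getFile().getAbsolutePath());
                    }
                }
                decoder.destroy();
                decoder = null;
            }
        }
    }
}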

Netty version

Tested on 4.1.56 and 4.1.58.

JVM version (e.g. java -version)

jdk1.8.0_162 and 12

OS version (e.g. uname -a)

Windows 10

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 7 (6 by maintainers)

Top GitHub Comments

1 reaction
fredericBregier commented, Feb 15, 2021

@danielflower or maybe wait for #11001, which also fixes this PARANOID issue and, even without that level, improves performance (about 4 times)… 😉

1 reaction
fredericBregier commented, Jan 29, 2021

Hi, the issue is only partially related, but it is still an issue.

  • the file is held entirely in the decoder's buffer while it is being read; previously it could be kept in memory, but chunk by chunk
  • once the file is complete, if the “disk”-based HttpData is used, the buffer is written to a temporary file, freeing the memory (see the sketch after this comment)
  • note that another bug was fixed at the same time, since buffers were previously freed (discardReadBytes) incorrectly

I agree that this should be changed to adapt the solution to the new way of finding the delimiter. Currently, if the data is written out (in Disk mode, or in Mixed mode when the size is greater than the limit), the buffer might be cleared (freed), so one could try to change the decoder along those lines.

Not sure it will work or be easy to implement, but I think the idea is there…
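
As a hedged illustration of the disk-backed HttpData side described in the bullets above: the HttpData itself writes its content to a temporary file, independent of how the decoder buffers the raw body. The class and method signatures are Netty's public API; the field names and sample values here are made up:

import io.netty.buffer.Unpooled;
import io.netty.handler.codec.http.multipart.DiskFileUpload;
import java.nio.charset.StandardCharsets;

public class DiskWriteDemo {
    public static void main(String[] args) throws Exception {
        // disk-backed upload: content goes to a temp file rather than staying on the heap
        DiskFileUpload upload = new DiskFileUpload("image", "guangzhou.jpeg",
                "image/jpeg", null, null, 0);
        upload.addContent(Unpooled.copiedBuffer("part 1 ", StandardCharsets.UTF_8), false);
        upload.addContent(Unpooled.copiedBuffer("part 2", StandardCharsets.UTF_8), true); // last chunk
        System.out.println("written to: " + upload.getFile().getAbsolutePath());
        upload.delete();
    }
}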
