
UploadFromStream throws OutOfMemoryException for large streams

See original GitHub issue

Which service (blob, file, queue, table) does this issue concern?

Azure Blob Storage

Which version of the SDK was used?

Microsoft.Azure.Storage.Blob 10.0.3

Which platform are you using?

.NET Framework (4.7.2)

What problem was encountered?

Cannot upload large streams (for example 2 GB) using CloudBlockBlob.UploadFromStream. The SDK throws the following exception:

System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown.

Or sometimes it throws:

Microsoft.Azure.Storage.StorageException: Exception of type 'System.OutOfMemoryException' was thrown.

Even for smaller streams, it consumes a lot of memory. CloudBlockBlob.UploadFromFile exhibits the same behavior.

How can we reproduce the problem in the simplest way?

To reproduce the problem in a Test project:

  1. Create a .NET Framework Test project (for example MS Test).
  2. Install Microsoft.Azure.Storage.Blob NuGet package. (Latest stable is 10.0.3)
  3. Copy the following code, update the connection string, and run the test. (The connection string in the code below is the Azure Storage Emulator connection string; the issue is reproducible against both Microsoft Azure and the Emulator.)

Here is the code:

using System;
using System.IO;
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Blob;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class UploadFromStreamTest
{
    const long GB = 1024 * 1024 * 1024;
    [TestMethod]
    [DataRow(2 * GB)]
    public void TestLargeBlob(long size)
    {
        var storageAccount = CloudStorageAccount.Parse("DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;BlobEndpoint=http://127.0.0.1:10000/devstoreaccount1;TableEndpoint=http://127.0.0.1:10002/devstoreaccount1;QueueEndpoint=http://127.0.0.1:10001/devstoreaccount1;");
        var blobClient = storageAccount.CreateCloudBlobClient();
        var container = blobClient.GetContainerReference("container");
        container.CreateIfNotExists();
        var blockBlob = container.GetBlockBlobReference($"{Guid.NewGuid()}");
        // Create a temp file of the requested size. SetLength produces a
        // sparse file, so this does not actually write 2 GB of data to disk.
        var fileName = Path.GetTempFileName();
        using (var fs = File.OpenWrite(fileName))
            fs.SetLength(size);
        try
        {
            using (var stream = File.OpenRead(fileName))
            {
                blockBlob.UploadFromStream(stream);
                Assert.IsTrue(blockBlob.Exists());
            }
        }
        finally
        {
            File.Delete(fileName);
            blockBlob?.DeleteIfExists();
        }
    }
}

Have you found a mitigation/solution?

We can use the following settings:

blobClient.DefaultRequestOptions.StoreBlobContentMD5 = false;
blockBlob.StreamWriteSizeInBytes = 5 * 1024 * 1024;  // in fact, anything larger than 4 MB + 1 B

However, this disables MD5 hash calculation, so the ContentMD5 property of the uploaded blob will be empty.
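Putting the workaround together with the repro code above, the upload portion of the test would look roughly like this (a sketch; `container` and `fileName` are assumed to be set up exactly as in the test above):

using System;
using System.IO;
using Microsoft.Azure.Storage.Blob;

// Disable MD5 buffering on the client and raise the block size above 4 MB
// so the SDK streams blocks instead of buffering the whole payload.
var blobClient = storageAccount.CreateCloudBlobClient();
blobClient.DefaultRequestOptions.StoreBlobContentMD5 = false;

var blockBlob = container.GetBlockBlobReference($"{Guid.NewGuid()}");
blockBlob.StreamWriteSizeInBytes = 5 * 1024 * 1024; // larger than 4 MB + 1 B

using (var stream = File.OpenRead(fileName))
{
    // With these settings the 2 GB upload completes without OutOfMemoryException,
    // at the cost of an empty ContentMD5 on the resulting blob.
    blockBlob.UploadFromStream(stream);
}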

WindowsAzure.Storage package doesn’t have this problem

The legacy WindowsAzure.Storage NuGet package does not have this problem; large streams upload successfully.
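For comparison, the equivalent upload with the legacy package looks like this (a sketch; the types live under the Microsoft.WindowsAzure.Storage namespaces, but the API shape is otherwise the same as in the test above, and `connectionString`/`fileName` are assumed from that context):

using System;
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

// Same flow as the failing test, but using the legacy package,
// which does not exhibit the excessive memory usage.
var account = CloudStorageAccount.Parse(connectionString);
var client = account.CreateCloudBlobClient();
var container = client.GetContainerReference("container");
container.CreateIfNotExists();
var blob = container.GetBlockBlobReference($"{Guid.NewGuid()}");
using (var stream = File.OpenRead(fileName))
    blob.UploadFromStream(stream);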

Issue Analytics

  • State: closed
  • Created 4 years ago
  • Comments: 5 (2 by maintainers)

Top GitHub Comments

2 reactions
seanmcc-msft commented, Aug 5, 2019

Fixed in v11.0.0.

1 reaction
kfarmer-msft commented, Jul 26, 2019

@r-aghaei

As I related in #894, I found cases where internal stream buffers weren’t being disposed. Fixing that brings memory usage back down to < 30 MB (< 80 MB in a test I wrote for a 50 GB upload blob case).

I need to inspect the File library case specifically, but I’m planning to get these fixes in our next release if possible.


Top Results From Across the Web

  • c# - Exception of type 'System.OutOfMemoryException' was ...
    It looks like you are trying to read the entire 8 GB file into memory into a single byte array. That probably isn't...
  • uploading large files throws error - Your remarks, ideas etc.
    "OutOfMemoryException' was thrown." when uploading relatively large video files (~260Mb) to blob storage using this provider.
  • Out of memory exception with file upload
    When uploading a file of around 1.2GB I get: Exception of type 'System.OutOfMemoryException' was thrown. at System.IO.MemoryStream.set_Capacity( ...
  • OutOfMemoryException Class (System)
    The exception that is thrown when there is not enough memory to continue the execution of a program.
