Blob upload results in excessive memory usage/leak
Which service (blob, file, queue, table) does this issue concern?
Blob
Which version of the SDK was used?
Microsoft.Azure.Storage.Blob -Version 10.0.3
Which platform are you using? (ex: .NET Core 2.1)
.NET Core 2.2 (SDK 2.2.300)
What problem was encountered?
OutOfMemory / excessive memory usage
How can we reproduce the problem in the simplest way?
using System.IO;
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Blob;

namespace ConsoleAppAzureBlobMemoryIssue
{
    class Program
    {
        static void Main(string[] args)
        {
            // Placeholder: substitute a real storage connection string.
            CloudStorageAccount account = CloudStorageAccount.Parse("StorageConnectionString");
            CloudBlobClient serviceClient = account.CreateCloudBlobClient();
            var container = serviceClient.GetContainerReference("backup");
            container.CreateIfNotExistsAsync().Wait();

            string path = @"c:\folder\gigafile.zip";
            CloudBlockBlob blob = container.GetBlockBlobReference(Path.GetFileName(path));
            blob.UploadFromFileAsync(path).Wait();
        }
    }
}
Run the program, open Task Manager, and "enjoy" how the memory usage of dotnet.exe explodes as the file gets uploaded; memory usage grows in direct proportion to the number of bytes uploaded so far. So if you attempt to upload a 50 GB file, you will need 50+ GB of free memory.
Have you found a mitigation/solution?
The last working version is WindowsAzure.Storage -Version 9.3.3; newer versions contain the issue.
Issue Analytics
- Created: 4 years ago
- Comments: 34 (9 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
I’d be interested to know if there are any plans to fix this issue soon, or whether there is an underlying technical limitation that requires the entire file to be held in memory. I recently updated our application to the latest version of Microsoft.Azure.Storage.Blob following the advice given here, but this high memory usage is an absolute showstopper for us, and it looks like I’m going to have to roll everything back to WindowsAzure.Storage 9.3.3.
Interestingly, it doesn’t seem to be a problem if the source isn’t a file. I can use UploadFromStreamAsync to copy from a stream from another large blob opened with OpenReadAsync, and memory usage stays well below the overall blob size.

Hi @GFoley83, if I recall correctly, it was fixed in both.
-Sean
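The stream-based workaround described in the comment above can be sketched as follows. This is a minimal sketch, not code from the thread: it assumes Microsoft.Azure.Storage.Blob 10.x, and the container and blob names ("backup", "large-source.bin", "copy-of-large-source.bin") are hypothetical placeholders. The key point, per the commenter, is that the upload source is a blob read stream from OpenReadAsync rather than a local file.

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Blob;

class StreamUploadWorkaroundSketch
{
    static async Task Main()
    {
        // Placeholder connection string, as in the repro above.
        CloudStorageAccount account = CloudStorageAccount.Parse("StorageConnectionString");
        CloudBlobClient client = account.CreateCloudBlobClient();
        CloudBlobContainer container = client.GetContainerReference("backup");
        await container.CreateIfNotExistsAsync();

        // Hypothetical source and destination blob names.
        CloudBlockBlob sourceBlob = container.GetBlockBlobReference("large-source.bin");
        CloudBlockBlob destBlob = container.GetBlockBlobReference("copy-of-large-source.bin");

        // Copy from a blob read stream instead of a local file; per the
        // comment above, memory usage then stays well below the blob size.
        using (var source = await sourceBlob.OpenReadAsync())
        {
            await destBlob.UploadFromStreamAsync(source);
        }
    }
}
```

Whether wrapping a local file in a FileStream and calling UploadFromStreamAsync avoids the leak is not confirmed in the thread; the commenter only reports the blob-to-blob streaming case staying within bounds.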