Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might look while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

Failure to upload large file (>200GB)

See original GitHub issue

Hi guys,

I have been having issues uploading large files (>200 GB) with this library. According to the documentation, with REST API versions after 2016-05-31 the size limit can be up to 4.75 TB (100 MB blocks × 50,000 blocks), so I have been experimenting with the block size limit (the default is 4 MB).

When I set MAX_BLOCK_SIZE larger than 10 MB, I saw a read timeout:

    AzureException: ReadTimeout: HTTPSConnectionPool(host='host.blob.core.windows.net', port=443): Read timed out. (read timeout=20)

When the block size was 10 MB and validate_content=True, I got an MD5 mismatch:

    AzureHttpError: The MD5 value specified in the request did not match with the MD5 value calculated by the server.

When I kept the block size at 10 MB and disabled the MD5 check, the 263 GB file uploaded successfully. I then downloaded it and manually ran md5sum on the whole file; the result matched the local copy. Updating the library to version 0.36.0 did not make any difference.

Could you please take a look and see whether this is a bug or a service limit, or whether I did something incorrectly?

Thanks
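
For context, here is a minimal sketch of the setup the reporter describes, using the legacy azure-storage (0.3x) Python SDK. The account, container, and file names are placeholders; MAX_BLOCK_SIZE and validate_content are the two knobs being varied in the experiments above, and the md5sum-style check at the end mirrors the manual verification step:

```python
import hashlib

from azure.storage.blob import BlockBlobService

# Placeholder credentials -- substitute real account details.
service = BlockBlobService(account_name='myaccount', account_key='mykey')

# Raise the per-block size from the 4 MB default to 10 MB; at 50,000 blocks
# per blob, 10 MB blocks cap a single blob at roughly 488 GB.
service.MAX_BLOCK_SIZE = 10 * 1024 * 1024

# validate_content=True sends a Content-MD5 header with each block so the
# service verifies it on receipt -- the check that raised AzureHttpError above.
service.create_blob_from_path(
    container_name='mycontainer',
    blob_name='bigfile.bin',
    file_path='/data/bigfile.bin',
    validate_content=True,
)

def local_md5(path, chunk_size=8 * 1024 * 1024):
    """Chunked MD5 of a local file, equivalent to running md5sum on it."""
    digest = hashlib.md5()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(chunk_size), b''):
            digest.update(chunk)
    return digest.hexdigest()

print(local_md5('/data/bigfile.bin'))
```

With the MD5 check disabled (validate_content=False, the default), the same call is what eventually succeeded for the 263 GB file.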

Issue Analytics

  • State: closed
  • Created: 6 years ago
  • Comments: 5 (4 by maintainers)

Top GitHub Comments

1 reaction
zezha-msft commented on Sep 6, 2017

Hi @thanhnhut90, thank you for bringing this to our attention!

Concerning the ReadTimeout, I would suggest increasing the socket_timeout (in BlockBlobService’s constructor) to more than 20 seconds and seeing whether that helps in your situation.

And for the MD5 mismatch, I will try to reproduce your issue and post an update here. 😃
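
A minimal sketch of that suggestion, again with placeholder credentials; socket_timeout is the constructor parameter the maintainer refers to, and the 300-second value is only an illustrative choice, not a recommended setting:

```python
from azure.storage.blob import BlockBlobService

# The SDK's default socket timeout is 20 seconds, matching the
# "read timeout=20" in the traceback above. Raising it gives slow
# block uploads more headroom; 300 seconds here is illustrative only.
service = BlockBlobService(
    account_name='myaccount',  # placeholder
    account_key='mykey',       # placeholder
    socket_timeout=300,
)
```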

0 reactions
zezha-msft commented on Nov 3, 2017

@thanhnhut90 This was fixed in azure-storage-blob v0.37.1. Please let me know if you encounter any other problems. 😃
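
For readers applying the fix, a small sketch to confirm that the installed package is at or past the fixed release (the package name assumes the post-0.37 split into azure-storage-blob):

```python
import pkg_resources

# Check the installed SDK version; the MD5 fix shipped in 0.37.1.
version = pkg_resources.get_distribution('azure-storage-blob').version
print(version)  # expect '0.37.1' or later
```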

Read more comments on GitHub >

Top Results From Across the Web

Resolve issues with uploading large files in Amazon S3
I'm trying to upload a large file (1 GB or larger) to Amazon Simple Storage Service (Amazon S3) using the console. However, the...

How To Upload Large Files To Google Drive Quickly - MASV
Learn the best ways to upload large files to Google Drive quickly, including reasons why large uploads to Google Drive can fail. Read...

Storage and upload limits for Google Workspace
Individual users can only upload 750 GB each day between My Drive and all shared drives. Users who reach the 750-GB limit or...

How best to get 150 GB uploaded to OneDrive
What you should do instead is drag and drop 5 GB at a time into the OneDrive folder in File Explorer, letting it upload each set...

What is the Dropbox file size limit?
Need to upload large files to Dropbox? Learn about file size limits for the Dropbox desktop and mobile apps, dropbox.com, and the Dropbox...


