Random authentication error when using account SAS to upload large file to blob storage
Hi guys,
I am seeing random authentication issues when using an account SAS to upload large files (>20 GB) to blob storage.
azure.common.AzureHttpError: Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
<?xml version="1.0" encoding="utf-8"?><Error><Code>AuthenticationFailed</Code><Message>Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:aa7f238b-0001-0006-3d71-cf3f37000000
Time:2017-05-18T00:54:58.0436290Z</Message><AuthenticationErrorDetail>sr is mandatory. Cannot be empty</AuthenticationErrorDetail></Error>
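For context, here is a minimal sketch of the upload pattern that hits this, assuming the legacy azure-storage Python SDK (BlockBlobService, pre-v12); the account name, container, blob name, and file path are placeholders, not the actual values used.

    # Minimal sketch, assuming the legacy azure-storage Python SDK (pre-v12).
    # "myaccount", "mycontainer", and the file paths are placeholders.
    from azure.storage.blob import BlockBlobService

    service = BlockBlobService(account_name="myaccount", sas_token="<account SAS token>")

    # Files above the single-put threshold are uploaded as a sequence of Put Block
    # requests, each carrying the SAS, so a problem in the retry path can surface
    # as AuthenticationFailed partway through the transfer.
    service.create_blob_from_path(
        "mycontainer",
        "large-file.bin",
        "/data/large-file.bin",
        max_connections=4,
    )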
The authentication error detail showed "sr is mandatory. Cannot be empty". Per the account SAS documentation, an account SAS does not include the sr parameter, while a service SAS does. Using the same token, I was able to successfully upload two large files at different times, but the upload also failed many times with this authentication error. The SAS token was given to me and I do not have direct access to the account, so I may not be able to change it or generate a service SAS.
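For reference, the two token types can be told apart by their query parameters. The snippet below is illustrative only (placeholder values, not real tokens): an account SAS carries ss (signed services) and srt (signed resource types) but no sr, while a service SAS is scoped to a single resource and therefore carries sr.

    # Illustrative placeholders only -- not real tokens or signatures.
    # Account SAS: 'ss' (signed services) and 'srt' (signed resource types), no 'sr'.
    account_sas = "sv=2017-04-17&ss=b&srt=sco&sp=rwc&se=2017-06-01T00:00:00Z&sig=<signature>"

    # Service SAS: scoped to one resource, so 'sr' (b = blob, c = container) is present.
    service_sas = "sv=2017-04-17&sr=c&sp=rwc&se=2017-06-01T00:00:00Z&sig=<signature>"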
Could you please take a look at whether this is a bug, or whether I am simply missing something?
Thanks
Issue Analytics
- State:
- Created 6 years ago
- Comments: 16 (5 by maintainers)
Top Results From Across the Web
Azure blob storage upload fail - Microsoft Q&A
I have an issue uploading a large file to my azure storage blob thru azure storage explorer. The file is approx. 160GB.
How to upload a 10 Gb file using SAS token - Stack Overflow
To copy large files to a blob you can use azcopy: Authenticate first: azcopy login.
SAS 9.4 M7 access to Azure Synapse SQL Pool (SQL DW)
The Bulk load process uploads the large/bulk data into an ADLS2 Blob Container before using the COPY INTO statement to push the data...
Azure Blob Storage vs File Storage | Serverless360
There are various options available in the Azure Storage Account for storing user data ... Large File Uploading: Why do we use BLOB...
Az storage blob copy example - Sport Castellano Reports
I'm trying to upload a file to an Azure Blob storage using AzCopy, ... to import or export large amounts of blob data...
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
I’m having the same intermittent failure when uploading large files, although most of my files are a little smaller. I’ve seen this failure occur with files as small as 2-5 GB.
Thanks @neh. Just an update: this appears to be an issue with our retry policies when using SAS (which will often get triggered by bigger uploads). We are looking into possible fixes and will get one out soon.
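Until that fix ships, one possible workaround is to set an explicit retry policy, or to disable retries to confirm whether the retry path is what breaks the SAS. The sketch below is an assumption-laden example, not a confirmed fix: it assumes azure-storage-common 1.x, where the retry classes live in azure.storage.common.retry (in the older azure-storage 0.3x packages they are under azure.storage.retry), and "myaccount" is a placeholder.

    # Workaround sketch, assuming the legacy azure-storage SDK (azure-storage-common 1.x).
    from azure.storage.blob import BlockBlobService
    from azure.storage.common.retry import ExponentialRetry, no_retry

    service = BlockBlobService(account_name="myaccount", sas_token="<account SAS token>")

    # Option 1: a more patient retry policy for long, flaky uploads.
    service.retry = ExponentialRetry(initial_backoff=15, increment_base=3, max_attempts=5).retry

    # Option 2: disable retries entirely to check whether re-issued requests
    # are the ones losing the SAS parameters.
    # service.retry = no_retry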