Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking at while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

Random authentication error when using account SAS to upload large file to blob storage

See original GitHub issue

Hi guys,

I am seeing random authentication issues when using account SAS to upload large files (>20 GB) to blob storage.

azure.common.AzureHttpError: Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
<?xml version="1.0" encoding="utf-8"?><Error><Code>AuthenticationFailed</Code><Message>Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:aa7f238b-0001-0006-3d71-cf3f37000000
Time:2017-05-18T00:54:58.0436290Z</Message><AuthenticationErrorDetail>sr is mandatory. Cannot be empty</AuthenticationErrorDetail></Error>

The authentication error detail showed sr is mandatory. Cannot be empty. Per the Account SAS documentation, an account SAS doesn’t have the sr parameter, while a service SAS does. Using the same token, I was able to successfully upload 2 large files at different times, but it also failed many times due to this authentication issue. The SAS token was given to me, and I do not have direct access to the account, so I may not be able to change it or generate a service SAS. Could you please take a look at whether this is a bug or I am simply missing something?
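Since the error mentions sr, one quick diagnostic is to look at which query parameters the token actually carries: per the public SAS documentation, an account SAS includes ss (signed services) and srt (signed resource types), while a service SAS includes sr (signed resource). A small sketch using only the standard library (this is a diagnostic helper I wrote for illustration, not part of the Azure SDK):

```python
from urllib.parse import parse_qs

def classify_sas_token(sas_token: str) -> str:
    """Roughly classify a SAS token by its query parameters.

    Account SAS tokens carry 'ss' and 'srt'; service SAS tokens carry
    'sr'. Based on the public SAS documentation; not an official API.
    """
    params = parse_qs(sas_token.lstrip('?'))
    if 'ss' in params and 'srt' in params:
        return 'account'
    if 'sr' in params:
        return 'service'
    return 'unknown'

# Hypothetical tokens with redacted signatures:
print(classify_sas_token('?sv=2017-04-17&ss=b&srt=sco&sp=rw&sig=REDACTED'))
print(classify_sas_token('?sv=2017-04-17&sr=b&sp=rw&sig=REDACTED'))
```

If the token classifies as an account SAS, the "sr is mandatory" rejection suggests the server is validating it as a service SAS somewhere along the request path.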

Thanks

Issue Analytics

  • State:closed
  • Created 6 years ago
  • Comments:16 (5 by maintainers)

Top GitHub Comments

2 reactions
neh commented, May 24, 2017

I’m having the same intermittent failure with uploading large files, although most of my files are a little smaller. I’ve seen this failure occur with files as small as 2-5 GB.

1 reaction
rambho commented, May 25, 2017

Thanks @neh. Just an update: this appears to be an issue with our retry policies when using SAS (which will often get triggered with bigger uploads). We are looking into possible fixes and will get that out soon.
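Until a fix lands, one workaround is to wrap each chunk upload in your own retry loop with exponential backoff. The sketch below is a generic, standard-library-only pattern (upload_chunk is a hypothetical callable standing in for whatever performs the actual block upload); it is not the SDK's built-in retry policy:

```python
import random
import time

def upload_with_retry(upload_chunk, chunk, max_attempts=5, base_delay=1.0):
    """Retry a flaky chunk upload with exponential backoff plus jitter.

    `upload_chunk` is any callable that raises on a transient failure
    (e.g. an intermittent auth or server error) and returns on success.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return upload_chunk(chunk)
        except Exception:
            if attempt == max_attempts:
                raise  # out of attempts; surface the last error
            # Double the delay each attempt, add jitter to spread load.
            delay = base_delay * (2 ** (attempt - 1)) + random.uniform(0, base_delay)
            time.sleep(delay)
```

Since the failures reported here are intermittent, a retry wrapper like this tends to mask them at the cost of extra upload time; it does not address the underlying SAS validation issue.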

Read more comments on GitHub >

Top Results From Across the Web

Azure blob storage upload fail - Microsoft Q&A
I have an issue uploading a large file to my azure storage blob thru azure storage explorer. The file is approx. 160GB.
Read more >
How to upload a 10 Gb file using SAS token - Stack Overflow
To copy large files to a blob you can use azcopy: Authenticate first: azcopy login.
Read more >
SAS 9.4 M7 access to Azure Synapse SQL Pool ( SQL DW)
The Bulk load process uploads the large/bulk data into an ADLS2 Blob Container before using the COPY INTO statement to push the data...
Read more >
Azure Blob Storage vs File Storage | Serverless360
There are various options available in the Azure Storage Account for storing user data ... Large File Uploading: Why do we use BLOB...
Read more >
Az storage blob copy example - Sport Castellano Reports
I'm trying to upload a file to an Azure Blob storage using AzCopy, ... to import or export large amounts of blob data...
Read more >