
Passing /dev/fd/0 to az storage blob upload no longer works

Description

With Azure CLI 2.0.12 and below, it was possible to upload a stream from stdin to blob storage with an invocation like the following:

az storage blob upload -f /dev/fd/0 --max-connections 1 ....

Obviously this only worked on platforms that have /dev/fd, but otherwise it worked nicely for uploading unknown-length streams we didn’t want to buffer to disk.
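
For illustration, a pipeline of this shape is the kind of thing that used to work (the producer, container, and blob names here are only placeholders):

tar czf - /var/log | az storage blob upload -c mycontainer -n logs.tar.gz -f /dev/fd/0 --max-connections 1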

This appears to have been broken since https://github.com/Azure/azure-cli/commit/7d51854504343b298f27d401e7bbf3bef8ffa039 - I suspect it has something to do with the MIME type detection. It now uploads only a zero-byte file, but otherwise exits successfully. Eyeballing the code suggests that passing --content-type should help, but this appeared to have no effect.
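
For reference, the attempted workaround looked roughly like this (the content type shown is only an example value; it made no difference to the result):

az storage blob upload -f /dev/fd/0 --content-type application/octet-stream --max-connections 1 ....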


Environment summary

Install Method: How did you install the CLI? (e.g. pip, interactive script, apt-get, Docker, MSI, nightly)
Answer here: pip install azure-cli

CLI Version: What version of the CLI and modules are installed? (Use az --version)
Answer here:

azure-cli (2.0.15)

acr (2.0.11)
acs (2.0.14)
appservice (0.1.14)
batch (3.1.2)
billing (0.1.4)
cdn (0.0.7)
cloud (2.0.7)
cognitiveservices (0.1.7)
command-modules-nspkg (2.0.1)
component (2.0.7)
configure (2.0.10)
consumption (0.1.4)
container (0.1.9)
core (2.0.14)
cosmosdb (0.1.12)
dla (0.0.11)
dls (0.0.13)
eventgrid (0.1.3)
feedback (2.0.6)
find (0.2.6)
interactive (0.3.8)
iot (0.1.11)
keyvault (2.0.8)
lab (0.0.10)
monitor (0.0.9)
network (2.0.13)
nspkg (3.0.1)
profile (2.0.11)
rdbms (0.0.6)
redis (0.2.8)
resource (2.0.13)
role (2.0.11)
servicefabric (0.0.2)
sf (1.0.7)
sql (2.0.10)
storage (2.0.13)
vm (2.0.13)

OS Version: What OS and version are you using?
Answer here: CentOS 7

Shell Type: What shell are you using? (e.g. bash, cmd.exe, Bash on Windows)
Answer here: Bash

Issue Analytics

  • State: closed
  • Created 6 years ago
  • Reactions: 1
  • Comments: 17 (11 by maintainers)

Top GitHub Comments

2 reactions
kevinoid commented, Oct 14, 2018

I believe this bug is still present in azure-cli (2.0.47) with storage (2.2.2) from the Debian package (2.0.47-1~stretch). Here’s a quick way to reproduce the issue (requires environment variables $AZURE_STORAGE_ACCOUNT and $container to be set by the user):

date | az storage blob upload -c $container -n date.txt -f /dev/fd/0
az storage blob download -c $container -n date.txt -f date.txt
cat date.txt

On my system, date.txt is empty when it should contain the output of date at the time of upload.
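
One way to confirm the zero-byte upload without downloading it (assuming the same $container and blob name as above; properties.contentLength is the JMESPath path into the CLI’s JSON output):

az storage blob show -c $container -n date.txt --query properties.contentLength
# prints 0 here, rather than the length of the date output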

I think the source of the previous confusion is that /dev/fd/0 works when stdin is redirected from a file (since seek(3) and stat(2) work) but does not work when stdin is a pipe. So az storage blob upload < file.iso works, while cat file.iso | az storage blob upload does not. This also occurs with a named pipe:

mkfifo date.fifo
date > date.fifo &
az storage blob upload -c $container -n date.txt -f date.fifo
az storage blob download -c $container -n date.txt -f date.txt
cat date.txt
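
The seekability difference is easy to check directly; a quick sketch, using python3 purely as a probe (not part of azure-cli):

python3 -c 'import sys; print(sys.stdin.buffer.seekable())' < date.txt   # True: regular file, seek works
date | python3 -c 'import sys; print(sys.stdin.buffer.seekable())'       # False: pipe, seek fails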

It would be great if azure-cli could support pipes without buffering, so that large intermediate files could be avoided. If not, it would be nice if azure-cli could print a reasonable error message, rather than silently discarding data.
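
In the meantime, the only workaround I know of is to stage the stream in a temporary file before uploading, which is exactly the buffering we would like to avoid; a rough sketch with placeholder names:

tmp=$(mktemp)
date > "$tmp"                 # stand-in for whatever actually produces the stream
az storage blob upload -c $container -n date.txt -f "$tmp"
rm -f "$tmp"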

Thanks, Kevin

0 reactions
troydai commented, Oct 15, 2018