
az storage blob upload should read from pipes

See original GitHub issue

Description

It would be very nice if “az storage blob upload” could support reading from pipes instead of trying to read from an existing file.
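What the issue asks for, sketched as hypothetical usage (azure-cli 2.0.6 does not accept stdin as the upload source, so the `az` invocation is shown commented out; the account, container, and blob names are placeholders):

```shell
# Hypothetical: stream data straight from a producer into the upload,
# with no intermediate file on disk. Not supported in azure-cli 2.0.6.
#
#   pg_dump mydb | az storage blob upload \
#     --account-name myaccount \
#     --container-name backups \
#     --name mydb.sql \
#     --file -

# Ordinary pipes already stream between processes without touching disk;
# the request is simply for `az` to be usable as the consuming end:
printf 'streamed payload' | wc -c    # counts the 16 bytes as they stream through
```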

Environment summary

Install Method: How did you install the CLI? (e.g. pip, interactive script, apt-get, Docker, MSI, nightly)
Answer here: apt-get

CLI Version: What version of the CLI and modules are installed? (Use az --version)
Answer here: azure-cli (2.0.6)

acr (2.0.4) acs (2.0.6) appservice (0.1.6) batch (2.0.4) cdn (0.0.2) cloud (2.0.2) cognitiveservices (0.1.2) command-modules-nspkg (2.0.0) component (2.0.4) configure (2.0.6) core (2.0.6) cosmosdb (0.1.6) dla (0.0.6) dls (0.0.6) feedback (2.0.2) find (0.2.2) interactive (0.3.2) iot (0.1.5) keyvault (2.0.4) lab (0.0.4) monitor (0.0.4) network (2.0.6) nspkg (3.0.0) profile (2.0.4) rdbms (0.0.1) redis (0.2.3) resource (2.0.6) role (2.0.4) sf (1.0.1) sql (2.0.3) storage (2.0.6) vm (2.0.6)

OS Version: What OS and version are you using?
Answer here: Debian 8 (Jessie)

Shell Type: What shell are you using? (e.g. bash, cmd.exe, Bash on Windows)
Answer here: Bash

Issue Analytics

  • State: closed
  • Created 6 years ago
  • Reactions: 9
  • Comments: 13 (5 by maintainers)

Top GitHub Comments

9 reactions
limingu commented, May 16, 2019

This feature is nice to have. Closing now as low priority.

7 reactions
samangh commented, Jul 30, 2019

This seems useful, and we are open to contributions. Currently, this is a low priority, as there is a clear workaround: pipe to a file, then use the existing upload command.

It’s worth noting that this is not a good workaround, because:

  • the amount of data that can be passed is limited by the free space available on the hard disk (which is not the case when using pipes);
  • this causes significant I/O activity on the hard disk, which obviously slows things down.
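The workaround the maintainer suggests can be sketched as follows (a minimal sketch; the storage account, container, and blob names are placeholders, and the `az` invocation is commented out):

```shell
set -eu

# Stage the piped data in a temporary file, then upload that file with the
# existing command. This needs as much free disk space as the data itself
# and doubles the I/O, which is exactly the drawback described above.
tmp="$(mktemp)"
trap 'rm -f "$tmp"' EXIT

# Stand-in for the real producer (e.g. pg_dump, tar, a log stream):
printf 'example payload\n' > "$tmp"

# Placeholder names; uncomment to run against a real storage account:
#   az storage blob upload \
#     --account-name myaccount \
#     --container-name mycontainer \
#     --name backup.dat \
#     --file "$tmp"

wc -c < "$tmp"    # the 16 bytes are staged on disk before the upload
```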
Read more comments on GitHub >

Top Results From Across the Web

az storage blob | Microsoft Learn
Upload blobs or subdirectories to a storage blob directory. az storage ... Create a new Block Blob where the content of the blob...
Read more >
Automating Snowpipe for Microsoft Azure Blob Storage
Storage Blob Data Contributor grants read and write access. This allows loading data from or unloading data to files staged in the storage...
Read more >
Azure Storage Blob Rename - Stack Overflow
But the difference is that its copying everything on Azure side so you don't have to hold Memorystream on your server for Copy/Upload...
Read more >
Calculate & Validate MD5 hashes on Azure blob storage files ...
Now, let's go ahead and upload the same file to an Azure storage ... we know how to reencode the file hashes, let's...
Read more >
