
Can't find --overwrite in az storage blob upload/upload-batch


az feedback auto-generates most of the information requested below, as of CLI version 2.0.62

Describe the bug

I noticed a breaking change in 2.34.1. However, I can't see it on my machine. If I pass this parameter, I get an "unrecognized parameter" error.

[BREAKING CHANGE] az storage blob upload/upload-batch: Fix --overwrite so that it no longer overwrites by default https://docs.microsoft.com/en-us/cli/azure/release-notes-azure-cli
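For context, the breaking change means an upload to an existing blob now fails unless --overwrite is passed explicitly. A minimal Python sketch of that semantic, assuming a dict stands in for the blob container (illustrative only, not the CLI's implementation):

```python
# Illustrative sketch of the post-2.34 default: uploading to an existing
# blob fails unless overwrite=True. A dict stands in for the container.

def upload_blob(container: dict, name: str, data: bytes, overwrite: bool = False) -> None:
    if name in container and not overwrite:
        # Mirrors the service-side BlobAlreadyExists error
        raise FileExistsError(f"BlobAlreadyExists: {name}")
    container[name] = data

container = {}
upload_blob(container, "report.txt", b"v1")               # first upload succeeds
try:
    upload_blob(container, "report.txt", b"v2")           # default: no overwrite
except FileExistsError as e:
    print(e)                                              # BlobAlreadyExists: report.txt
upload_blob(container, "report.txt", b"v2", overwrite=True)  # explicit overwrite
print(container["report.txt"])                            # b'v2'
```

The same opt-in pattern appears in the azure-storage-blob SDK (`upload_blob(..., overwrite=True)`), so a script that previously relied on silent overwriting must now pass the flag explicitly.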

PS C:\> az -v
azure-cli                         2.34.1

core                              2.34.1
telemetry                          1.0.6

Extensions:
image-copy-extension               0.2.4
quantum                            0.2.0

Dependencies:
msal                              1.16.0
azure-mgmt-resource               20.0.0

Python location 'C:\Program Files (x86)\Microsoft SDKs\Azure\CLI2\python.exe'
Extensions directory 'C:\Users\yfy\.azure\cliextensions'
Development extension sources:
    C:\Users\yfy\work\azure-cli-extensions

Python (Windows) 3.8.9 (tags/v3.8.9:a743f81, Apr  6 2021, 13:22:56) [MSC v.1928 32 bit (Intel)]

Legal docs and information: aka.ms/AzureCliLegal


Your CLI is up-to-date.

Please let us know how we are doing: https://aka.ms/azureclihats
and let us know if you're interested in trying out our newest features: https://aka.ms/CLIUXstudy
PS C:\> az storage blob upload-batch -h

Command
    az storage blob upload-batch : Upload files from a local directory to a blob container.

Arguments
    --destination -d             [Required] : The blob container where the files will be uploaded.
        The destination can be the container URL or the container name. When the destination is the
        container URL, the storage account name will be parsed from the URL.
    --source -s                  [Required] : The directory where the files to be uploaded are
                                              located.
    --auth-mode                             : The mode in which to run the command. "login" mode
                                              will directly use your login credentials for the
                                              authentication. The legacy "key" mode will attempt to
                                              query for an account key if no authentication
                                              parameters for the account are provided. Environment
                                              variable: AZURE_STORAGE_AUTH_MODE.  Allowed values:
                                              key, login.
    --destination-path                      : The destination path that will be prepended to the
                                              blob name.
    --dryrun                                : Show the summary of the operations to be taken instead
                                              of actually uploading the file(s).
    --lease-id                              : The active lease id for the blob.
    --max-connections                       : Maximum number of parallel connections to use when the
                                              blob size exceeds 64MB.  Default: 2.
    --metadata                              : Metadata in space-separated key=value pairs. This
                                              overwrites any existing metadata.
    --no-progress                           : Include this flag to disable progress reporting for
                                              the command.
    --pattern                               : The pattern used for globbing files or blobs in the
                                              source. The supported patterns are '*', '?', '[seq]',
                                              and '[!seq]'. For more information, please refer to
                                              https://docs.python.org/3.7/library/fnmatch.html.
        When you use '*' in --pattern, it will match any character including the directory
        separator '/'.
    --socket-timeout                        : The socket timeout(secs), used by the service to
                                              regulate data flow.
    --timeout                               : Request timeout in seconds. Applies to each call to
                                              the service.
    --type -t                               : Defaults to 'page' for *.vhd files, or 'block'
                                              otherwise. The setting will override blob types for
                                              every file.  Allowed values: append, block, page.

Content Control Arguments
    --content-cache --content-cache-control : The cache control string.
    --content-disposition                   : Conveys additional information about how to process
                                              the response payload, and can also be used to attach
                                              additional metadata.
    --content-encoding                      : The content encoding type.
    --content-language                      : The content language.
    --content-md5                           : The content's MD5 hash.
    --content-type                          : The content MIME type.
    --maxsize-condition                     : The max length in bytes permitted for an append blob.
    --validate-content                      : Specifies that an MD5 hash shall be calculated for
                                              each chunk of the blob and verified by the service
                                              when the chunk has arrived.

Precondition Arguments
    --if-match                              : An ETag value, or the wildcard character (*). Specify
                                              this header to perform the operation only if the
                                              resource's ETag matches the value specified.
    --if-modified-since                     : Commence only if modified since supplied UTC datetime
                                              (Y-m-d'T'H:M'Z').
    --if-none-match                         : An ETag value, or the wildcard character (*).
        Specify this header to perform the operation only if the resource's ETag does not match the
        value specified. Specify the wildcard character (*) to perform the operation only if the
        resource does not exist, and fail the operation if it does exist.
    --if-unmodified-since                   : Commence only if unmodified since supplied UTC
                                              datetime (Y-m-d'T'H:M'Z').

Storage Account Arguments
    --account-key                           : Storage account key. Must be used in conjunction with
                                              storage account name. Environment variable:
                                              AZURE_STORAGE_KEY.
    --account-name                          : Storage account name. Related environment variable:
                                              AZURE_STORAGE_ACCOUNT. Must be used in conjunction
                                              with either storage account key or a SAS token. If
                                              neither are present, the command will try to query the
                                              storage account key using the authenticated Azure
                                              account. If a large number of storage commands are
                                              executed the API quota may be hit.
    --connection-string                     : Storage account connection string. Environment
                                              variable: AZURE_STORAGE_CONNECTION_STRING.
    --sas-token                             : A Shared Access Signature (SAS). Must be used in
                                              conjunction with storage account name. Environment
                                              variable: AZURE_STORAGE_SAS_TOKEN.

Global Arguments
    --debug                                 : Increase logging verbosity to show all debug logs.
    --help -h                               : Show this help message and exit.
    --only-show-errors                      : Only show errors, suppressing warnings.
    --output -o                             : Output format.  Allowed values: json, jsonc, none,
                                              table, tsv, yaml, yamlc.  Default: json.
    --query                                 : JMESPath query string. See http://jmespath.org/ for
                                              more information and examples.
    --subscription                          : Name or ID of subscription. You can configure the
                                              default subscription using `az account set -s
                                              NAME_OR_ID`.
    --verbose                               : Increase logging verbosity. Use --debug for full debug
                                              logs.

Examples
    Upload all files that end with .py unless blob exists and has been modified since given date.
        az storage blob upload-batch -d mycontainer --account-name mystorageaccount --account-key
        00000000 -s <path-to-directory> --pattern *.py --if-unmodified-since 2018-08-27T20:51Z


    Upload all files from local path directory to a container named "mycontainer".
        az storage blob upload-batch -d mycontainer -s <path-to-directory>


    Upload all files with the format 'cli-2018-xx-xx.txt' or 'cli-2019-xx-xx.txt' in local path
    directory.
        az storage blob upload-batch -d mycontainer -s <path-to-directory> --pattern
        cli-201[89]-??-??.txt


    Upload all files with the format 'cli-201x-xx-xx.txt' except cli-2018-xx-xx.txt' and
    'cli-2019-xx-xx.txt' in a container.
        az storage blob upload-batch -d mycontainer -s <path-to-directory> --pattern
        cli-201[!89]-??-??.txt


To search AI knowledge base for examples, use: az find "az storage blob upload-batch"

Please let us know how we are doing: https://aka.ms/azureclihats

To Reproduce

Expected behavior

Environment summary

MSI Windows 10

Additional context

Issue Analytics

  • State: closed
  • Created: a year ago
  • Comments: 11 (6 by maintainers)

Top GitHub Comments

1 reaction
calvinhzy commented, Jul 26, 2022

We are adding --overwrite in 2.39.0 without changing existing behavior (download overwrites by default, and download-batch does not overwrite by default).
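The asymmetric defaults the comment describes can be sketched as follows. The helper names are hypothetical and dicts stand in for the storage container and the local directory; only the default values of the overwrite parameter are the point:

```python
# Sketch of the defaults described above: the single-blob command overwrites
# an existing local file by default, while the batch command skips it.

def download_blob(container: dict, name: str, local: dict, overwrite: bool = True) -> None:
    if name in local and not overwrite:
        raise FileExistsError(name)
    local[name] = container[name]

def download_batch(container: dict, local: dict, overwrite: bool = False) -> list:
    skipped = []
    for name, data in container.items():
        if name in local and not overwrite:
            skipped.append(name)  # existing files are left untouched by default
            continue
        local[name] = data
    return skipped

container = {"a.txt": b"new-a", "b.txt": b"new-b"}
local = {"a.txt": b"old-a"}
print(download_batch(container, local))   # ['a.txt'] -- skipped by default
download_blob(container, "a.txt", local)  # single download overwrites by default
print(local["a.txt"])                     # b'new-a'
```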

1 reaction
calvinhzy commented, Mar 31, 2022

I was not able to reproduce it either, with the same Python version 3.8.9. Please try rerunning pip install azure-cli.
