[storage] StorageClient not setting the `credential` correctly when using TokenCredential
As a result, blob batch operations do not work for the `BlobClient` overload:
public async deleteBlobs(
blobClients: BlobClient[],
options?: BlobDeleteOptions
): Promise<BlobBatchDeleteBlobsResponse>;
protected constructor(url: string, pipeline: Pipeline) {
// URL should be encoded and only once, protocol layer shouldn't encode URL again
this.url = escapeURLPath(url);
this.accountName = getAccountNameFromUrl(url);
this.pipeline = pipeline;
this.storageClientContext = new StorageClientContext(
this.url,
pipeline.toServiceClientOptions()
);
this.isHttps = iEqual(getURLScheme(this.url) || "", "https");
this.credential = new AnonymousCredential();
for (const factory of this.pipeline.factories) {
if (
(isNode && factory instanceof StorageSharedKeyCredential) ||
factory instanceof AnonymousCredential ||
isTokenCredential(factory)
) {
this.credential = factory;
}
}
// Override protocol layer's default content-type
const storageClientContext = this.storageClientContext as any;
storageClientContext.requestContentType = undefined;
}
However, when a `TokenCredential` is used, the factory stored in the pipeline is a `bearerTokenAuthenticationPolicy` wrapping the credential, not the credential itself. Consequently `isTokenCredential(factory)` never matches and `this.credential` stays the `AnonymousCredential`:
https://github.com/Azure/azure-sdk-for-js/blob/79b0fdf31b0a70d5384d5d0328e90314db04a411/sdk/storage/storage-blob/src/Pipeline.ts#L220
export function newPipeline(
credential?: StorageSharedKeyCredential | AnonymousCredential | TokenCredential,
pipelineOptions: StoragePipelineOptions = {}
): Pipeline {
if (credential === undefined) {
credential = new AnonymousCredential();
}
// Order is important. Closer to the API at the top & closer to the network at the bottom.
// The credential's policy factory must appear close to the wire so it can sign any
// changes made by other factories (like UniqueRequestIDPolicyFactory)
const telemetryPolicy = new TelemetryPolicyFactory(pipelineOptions.userAgentOptions);
const factories: RequestPolicyFactory[] = [
tracingPolicy({ userAgent: telemetryPolicy.telemetryString }),
keepAlivePolicy(pipelineOptions.keepAliveOptions),
telemetryPolicy,
generateClientRequestIdPolicy(),
new StorageBrowserPolicyFactory(),
deserializationPolicy(), // Default deserializationPolicy is provided by protocol layer
new StorageRetryPolicyFactory(pipelineOptions.retryOptions),
logPolicy({
logger: logger.info,
allowedHeaderNames: StorageBlobLoggingAllowedHeaderNames,
allowedQueryParameters: StorageBlobLoggingAllowedQueryParameters
})
];
if (isNode) {
// policies only available in Node.js runtime, not in browsers
factories.push(proxyPolicy(pipelineOptions.proxyOptions));
factories.push(disableResponseDecompressionPolicy());
}
factories.push(
isTokenCredential(credential)
? bearerTokenAuthenticationPolicy(credential, StorageOAuthScopes)
: credential
);
return new Pipeline(factories, pipelineOptions);
}
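The mismatch can be reproduced in isolation. The sketch below uses simplified stand-ins for `isTokenCredential`, `AnonymousCredential`, and `bearerTokenAuthenticationPolicy` (not the real SDK types) to show why the detection loop in the `StorageClient` constructor never finds the wrapped credential:

```typescript
// Simplified stand-ins, for illustration only; the real definitions live in
// @azure/core-http and @azure/storage-blob.

interface TokenCredential {
  getToken(scope: string): Promise<{ token: string } | null>;
}

// Simplified duck-type check, modeled on core-http's isTokenCredential.
function isTokenCredential(candidate: unknown): candidate is TokenCredential {
  const c = candidate as { getToken?: unknown; signRequest?: unknown };
  return !!c && typeof c.getToken === "function" && c.signRequest === undefined;
}

class AnonymousCredential {}

// Stand-in for bearerTokenAuthenticationPolicy: it returns a plain policy
// factory object that closes over the credential; the factory itself has no
// getToken method.
function bearerTokenAuthenticationPolicy(credential: TokenCredential) {
  return { create: () => ({ credential }) };
}

const tokenCredential: TokenCredential = {
  getToken: async () => ({ token: "fake" }),
};

// newPipeline pushes the *policy factory*, not the credential itself:
const factories: unknown[] = [bearerTokenAuthenticationPolicy(tokenCredential)];

// The StorageClient constructor loop therefore never reassigns:
let found: unknown = new AnonymousCredential();
for (const factory of factories) {
  if (factory instanceof AnonymousCredential || isTokenCredential(factory)) {
    found = factory;
  }
}

console.log(found instanceof AnonymousCredential); // true: the bug, reproduced
```

Because the factory only exposes `create`, the `isTokenCredential` duck-type check fails, and `this.credential` is left at its `AnonymousCredential` default.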
Issue Analytics
- Created 3 years ago
- Comments: 8 (8 by maintainers)
Top GitHub Comments
@xirzec Yes, the commit is only a minimal POC and can be improved if we decide #2 is the way to go.
Right. What we really need in the blob batch client is to re-use the existing authentication request policy factory to create a new pipeline. In the future, when we can afford breaking changes and can use Azure Core v2, it will probably be easier to look up a factory from the existing pipeline and re-use it.
For now, exposing `credential` in a new interface seems to be the way to go?
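One hypothetical shape for that interface, with all names below (`PipelineLike`, `makePipeline`, `getCredential`) being illustrative rather than the actual SDK API, could be:

```typescript
// Hypothetical sketch only: surface the credential that built the pipeline so
// a batch client can re-use it instead of scanning the factory list.
// PipelineLike, makePipeline, and getCredential are invented names, not the
// actual @azure/storage-blob API.

class StorageSharedKeyCredential {
  constructor(public accountName: string, public accountKey: string) {}
}
class AnonymousCredential {}
interface TokenCredential {
  getToken(scope: string): Promise<{ token: string } | null>;
}
type StorageCredential =
  | StorageSharedKeyCredential
  | AnonymousCredential
  | TokenCredential;

interface PipelineLike {
  factories: unknown[];
  // New: the credential the pipeline was constructed with.
  getCredential(): StorageCredential;
}

function makePipeline(
  credential: StorageCredential,
  factories: unknown[]
): PipelineLike {
  return { factories, getCredential: () => credential };
}

// A batch client could then build sub-request pipelines with the same
// credential, whether it is a shared key or a TokenCredential:
const cred = new StorageSharedKeyCredential("myaccount", "bXlrZXk=");
const pipeline = makePipeline(cred, []);
console.log(pipeline.getCredential() === cred); // true
```

This avoids the fragile factory-scanning loop entirely: the credential is handed over explicitly, so a wrapped `bearerTokenAuthenticationPolicy` no longer hides it.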