Request: Accept full range of azure storage queue connection parameters
I'd like to test the Azure Storage Queue transport locally using the Azure Storage emulator, Azurite [link], which lets you run the blob and queue storage services locally in a Docker image, accessible over HTTP on localhost.
Kombu should be able to connect to it; however, the connection URL it accepts doesn't make room for any connection parameters other than `account_name` and `access_key`, limiting you to e.g. the default endpoint (`core.windows.net`) and transport scheme (`https`).

Here is the full range accepted by `QueueService`:
```python
QueueService(account_name=None, account_key=None, sas_token=None,
             is_emulated=False, protocol='https',
             endpoint_suffix='core.windows.net', request_session=None,
             connection_string=None, socket_timeout=None)
```
Kombu just looks at `account_name` and `account_key` and uses the defaults for the rest:

https://kombu.readthedocs.io/en/latest/reference/kombu.transport.azurestoragequeues.html
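For illustration, here is a minimal sketch of what supporting a full connection string would involve; the parsing itself is simple, since the format is just semicolon-separated `key=value` pairs. The helper name is hypothetical (not kombu or SDK code), and the Azurite connection string below is shortened (`Eby8...` stands in for the well-known development account key):

```python
def parse_connection_string(conn_str):
    """Split an Azure storage connection string into a dict of its parts.

    Hypothetical helper, sketched for illustration only.
    """
    parts = {}
    for segment in conn_str.strip().split(';'):
        if not segment:
            continue
        # Only split on the first '=' so base64 keys ending in '==' survive.
        key, _, value = segment.partition('=')
        parts[key] = value
    return parts


# Shape of Azurite's development connection string (account key shortened):
conn = (
    "DefaultEndpointsProtocol=http;"
    "AccountName=devstoreaccount1;"
    "AccountKey=Eby8...;"
    "QueueEndpoint=http://localhost:10001/devstoreaccount1;"
)

parsed = parse_connection_string(conn)
print(parsed["DefaultEndpointsProtocol"])  # http
print(parsed["AccountName"])               # devstoreaccount1
```

With the parts in hand, a transport could pick out the protocol, endpoint, and credentials instead of hard-coding the `https`/`core.windows.net` defaults.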
At a minimum, by supporting `connection_string`, you could let `QueueService` itself parse everything for you. After a brief look into the source, I wasn't able to quickly determine where this change would need to be made; maybe in `azurestoragequeues.py` or its base class `virtual.Channel`?
I’d be happy to make a PR if I knew where to look.
Issue Analytics
- State:
- Created 3 years ago
- Comments: 6 (3 by maintainers)
Top GitHub Comments
After looking at this further, the kombu library seems to be working with the old version of the Azure Storage Queue library. The newest version of the SDK (^12.0) has apparently refactored the `QueueService` class into the `QueueServiceClient` class, which only supports authenticating via a connection string rather than an account name/access key: link to python SDK docs
I have just tried out v5.3.0a1 using the updated `azurestoragequeues` transport and a local Azurite instance running in Docker. The updated transport works as expected when the account URL is `http://localhost:10001/devstoreaccount1`. However, my celery worker fails to start when running inside a docker-compose configuration, where the account URL becomes `http://azurite:10001/devstoreaccount1`, `azurite` being the name of the Azurite container instance. This appears to be related to an issue with the Azure Python SDK: https://github.com/Azure/azure-sdk-for-python/issues/19202. Might it be worth considering the suggested workaround in that issue thread?
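For context on why the hostname matters, the linked SDK issue comes down to how the account name is inferred from the account URL. The following is a heavily simplified, hypothetical sketch of that inference (not the SDK's actual code), showing why a custom hostname like `azurite` matches neither the emulator rule nor the cloud rule:

```python
from urllib.parse import urlparse


def guess_account_name(account_url):
    """Simplified illustration of account-name inference from a URL.

    Hypothetical sketch: emulator-style hosts take the account name from
    the first path segment; anything else is assumed to be a cloud URL
    whose first hostname label is the account name.
    """
    parsed = urlparse(account_url)
    host = parsed.hostname or ""
    if host in ("localhost", "127.0.0.1"):
        # Emulator style: http://localhost:10001/<account>
        return parsed.path.lstrip("/").split("/")[0]
    # Cloud style: https://<account>.queue.core.windows.net
    return host.split(".")[0]


print(guess_account_name("http://localhost:10001/devstoreaccount1"))
# devstoreaccount1
print(guess_account_name("https://myaccount.queue.core.windows.net"))
# myaccount
print(guess_account_name("http://azurite:10001/devstoreaccount1"))
# azurite  -- the container hostname, not the account name
```

Under this kind of logic, the docker-compose URL yields `azurite` as the account name instead of `devstoreaccount1`, which is consistent with the worker failing to start in that setup.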