az functionapp deployment source config-zip requests/sets invalid SAS Uri in WEBSITE_RUN_FROM_PACKAGE
See original GitHub issue
az feedback auto-generates most of the information requested below, as of CLI version azure-cli 2.3.1
Describe the bug
When using "az functionapp deployment source config-zip" on a serverless app, it sets a Blob Storage SAS Uri for the zip file in the WEBSITE_RUN_FROM_PACKAGE setting of the Function App's Application Settings. The SAS Uri is not valid, however: the Start Time is taken from the local device's clock, but the timezone is written as GMT. This only affects regions ahead of GMT - for me, Australia. Example of a URL that was set:
<Error> <Code>AuthenticationFailed</Code> <Message> Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. RequestId:2baed45f-a01e-000e-3dc5-14300d000000 Time:2020-04-17T14:37:46.3759129Z </Message> <AuthenticationErrorDetail> Signature not valid in the specified time frame: Start [Fri, 17 Apr 2020 23:56:15 GMT] - Expiry [Sat, 06 Apr 2030 00:06:15 GMT] - Current [Fri, 17 Apr 2020 14:37:46 GMT] </AuthenticationErrorDetail> </Error>
The start time set was 23:56:15 GMT, and at the time my local time was 23:56:15 - but not GMT (I was in ACST). As a result the deployment produced this error:
Encountered an error (ServiceUnavailable) from host runtime."},{"CodSTRALIASOUTHEAST:20200417T144411Z:329b5457-710d-4113-9d06-n error (ServiceUnavailable) from host runtime."},{"Code":"BadRequest"},{"ErrorEntity":{"Code":"BadRequest","Message":"Encountered an error (ServiceUnavailable) from host runtime."}}],"Innererror":null} msrest.exceptions : Operation returned an invalid status code 'Bad Request' cli.azure.cli.core.util : Operation returned an invalid status code 'Bad Request' Operation returned an invalid status code 'Bad Request'
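To make the mismatch concrete, here is a small stdlib-only sketch (my own illustration, not the CLI's code) showing how a local ACST wall-clock reading, re-labelled as GMT, puts the SAS start time roughly nine and a half hours into the future from the storage service's point of view:

```python
from datetime import datetime, timedelta, timezone

# Illustration of the mismatch reported above (not azure-cli code).
# The local wall clock reads 23:56:15 in ACST (UTC+9:30), but the same
# digits end up in the SAS token labelled as GMT.

ACST = timezone(timedelta(hours=9, minutes=30))

local_now = datetime(2020, 4, 17, 23, 56, 15, tzinfo=ACST)
actual_utc = local_now.astimezone(timezone.utc)                          # 2020-04-17 14:26:15+00:00
claimed_start = datetime(2020, 4, 17, 23, 56, 15, tzinfo=timezone.utc)   # what the token says

print(actual_utc)                   # the instant the storage service sees as "now"
print(claimed_start)                # the instant the token claims to become valid
print(claimed_start - actual_utc)   # 9:30:00 in the future -> AuthenticationFailed
```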
To Reproduce
- You need to run the script from a device in a timezone ahead of GMT. Change your PC's timezone to Australia, for example.
- Create a serverless Python Function App in any region (originally I thought the region mattered, but it doesn't).
- Set up a simple Python project as per the Python instructions.
- Create a zip of the Python project.
- Run: az functionapp deployment source config-zip … The first time it runs you will get an 'unknown error'; on subsequent runs you will get BadRequest - Encountered an error (ServiceUnavailable) from host runtime.
- Navigate to the Application Settings page of the Function App. Click the pencil icon on the WEBSITE_RUN_FROM_PACKAGE setting, copy its contents and open it in a browser - first check whether you get an error accessing the zip, then check the Start time of the SAS Uri (see the sketch after this list).
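As a quick check for that last step, the following stdlib-only helper (my own sketch, not part of azure-cli) pulls the st (start) parameter out of the SAS URL and compares it with the current UTC time:

```python
from datetime import datetime, timezone
from urllib.parse import parse_qs, urlparse

# Hypothetical helper: paste the URL copied from WEBSITE_RUN_FROM_PACKAGE
# and check whether its SAS start time ("st") lies in the future.

def check_sas_start(sas_url: str) -> None:
    query = parse_qs(urlparse(sas_url).query)
    start_param = query.get("st", [None])[0]          # e.g. 2020-04-17T23:56:15Z
    if start_param is None:
        print("No explicit start time (st) on this SAS; start-time skew is not the issue.")
        return
    start = datetime.fromisoformat(start_param.replace("Z", "+00:00"))
    now = datetime.now(timezone.utc)
    print(f"SAS start : {start}")
    print(f"UTC now   : {now}")
    if start > now:
        print(f"Start is {start - now} in the future -> signature not yet valid.")
    else:
        print("Start is already in the past; start-time skew is not the problem here.")

# Example with a made-up URL:
# check_sas_start("https://<account>.blob.core.windows.net/<container>/app.zip?st=...&se=...&sig=...")
```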
I suspect it is using the deployment device's local time to set the start time, but labelling that time as GMT.
Expected behavior
The Start Time of the SAS Uri should either carry the device's timezone (if the device's local time is used), or the SAS Uri should be generated using the GMT timezone and the current GMT time as the start date (not the local device's wall-clock time).
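Either of those two options fixes the token, because a timezone-aware local timestamp and a UTC timestamp describe the same instant. A quick stdlib check (sketch only, not CLI code):

```python
from datetime import datetime, timezone

# The device's local time with its real offset attached ...
aware_local = datetime.now().astimezone()
# ... and the current time generated directly in UTC.
utc_now = datetime.now(timezone.utc)

print(aware_local.isoformat())     # e.g. 2020-04-17T23:56:15+09:30
print(utc_now.isoformat())         # e.g. 2020-04-17T14:26:15+00:00
print(abs(aware_local - utc_now))  # effectively zero - they are the same instant
```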
Environment summary
azure-cli 2.3.1, PowerShell Core 7
Issue Analytics
- Created 3 years ago
- Comments: 8 (1 by maintainers)
Okay, so I have determined that it does not matter which region the services sit in (originally I was using Australia East, but even after moving the services to East US I still get the issue). This shows it is using the local time of the PC running the az functionapp command, but not that PC's timezone. I have updated my opening post above with this correction.
So, I tried to fork this to see if I could work up a patch, but it’s not even clear how to run the test suite 😦
The problem appears to be in src\azure-cli\azure\cli\command_modules\appservice\custom.py:558 on the current dev head. The function upload_zip_to_storage is doing now = datetime.datetime.now() - which is a timezone-unaware datetime object - and passing that over to generate_blob_shared_access_signature, which wants to convert everything to UTC. Either that call should be datetime.datetime.utcnow(), or a tzinfo needs to be passed in. Otherwise it just does a naive conversion, which means the SAS token is always wrong (by a minimum of 50 minutes).
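Based on that diagnosis, here is a minimal sketch of the reported pattern and the two obvious corrections. It is not a verbatim excerpt of custom.py; the 10-minute backdate and roughly 10-year expiry below are inferred from the start and expiry stamps in the error above, not read from the source:

```python
import datetime

# Reported buggy pattern: a naive local wall-clock reading that is later
# treated as UTC when the SAS signature is generated.
now = datetime.datetime.now()

# Correction option 1: a naive datetime that really is UTC.
now_utc_naive = datetime.datetime.utcnow()

# Correction option 2: a timezone-aware UTC datetime.
now_utc_aware = datetime.datetime.now(datetime.timezone.utc)

# Whichever value is used would then feed the start/expiry arguments passed to
# generate_blob_shared_access_signature (figures below are illustrative).
start = now_utc_aware - datetime.timedelta(minutes=10)   # small backdate to absorb clock skew
expiry = now_utc_aware + datetime.timedelta(days=3650)   # long-lived expiry, roughly ten years
```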