The current design for PowerShell functions in an Azure Function App
I just completed creating my first Azure Function App, complete with five separate PowerShell functions (four triggered by HTTP/REST, one triggered when messages are added to an Azure Storage Queue). While working on this I ran into several design issues that raised questions I hope to get answered. For items that you agree are issues with Azure Function Apps, I will log separate issues so that they can be tracked and properly fixed. For any of these items that are not issues and are simply due to a misunderstanding on my part, I would really appreciate it if you could highlight what I am missing.
PowerShell module support
When it comes to PowerShell modules, I noticed that the original Azure Function Apps design was to simply search for all .dll, .psd1 and .psm1 files in a modules subfolder under an Azure function, and that about 5 months ago that was extended to recursively search for .dll, .psd1 and .psm1 files. Both the original design and this extension are seriously flawed, a significant step backwards for module support within PowerShell, and should be reconsidered.
The current design has the following flaws:
- It strongly discourages the use of any modules other than your own, because any module beyond the simplest (single-file modules) will require manual modification. That will inevitably introduce bugs that may go undetected by the person making the modifications and that are difficult to find, since you won’t have access to the native PowerShell debugger.
- It encourages unnecessary duplication of code: when a bug is fixed in the original source that was copied into the Azure Function App because of these module design flaws, the copy retains the bug longer than it should.
- It discourages module dependencies: any module that takes a dependency cannot load properly without extra scripting to make it work the way it already does in native PowerShell.
- It completely ignores $env:PSModulePath and all of the value that brings to PowerShell modules.
- It loses any and all notion of precedence among modules.
With the design as it stands today, it appears that you are reinventing the wheel with the mindset of a PowerShell 1.0 scripter. This design needs to be discarded in favor of a proper one.
Recommendation
Azure Function Apps should automatically update $env:PSModulePath in the runspace used to invoke an Azure Function that is based on PowerShell so that it includes two additional folder paths: one for all functions within an Azure Function App, and one for each PowerShell function within an Azure Function App. The folder path specific to a PowerShell function should take precedence (be defined first in $env:PSModulePath) over the folder path for all functions; this would allow for command overrides. The current design (recursively searching a modules folder for specific files) should be discarded in favor of this, so that Azure Function Apps properly leverage PowerShell’s built-in module discovery mechanism ($env:PSModulePath).
The shared modules location should also be a well-known folder that already exists, which users can copy modules into, at least using the Azure Storage Explorer and eventually via the Azure Function Apps UI.
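As a rough sketch of what this could look like (the folder layout below is an assumption for illustration, not an existing convention), the host would only need to prepend the two folders before invoking run.ps1:

# Hypothetical host-side setup; folder names are assumptions for illustration only.
$sharedModules   = Join-Path -Path $env:HOME -ChildPath 'data\Modules'                       # shared by all functions
$functionModules = Join-Path -Path $env:HOME -ChildPath 'site\wwwroot\MyFunctionName\Modules' # specific to one function

# Per-function folder first, so its modules take precedence over the shared ones.
$env:PSModulePath = "$functionModules;$sharedModules;$env:PSModulePath"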
Added benefit
An added benefit to this design, which is well worth pursuing, is that you could have an “Add to Azure Function App” button next to modules in the PowerShell Gallery. This would be very similar to the “Add to Azure Automation” button that exists in the PowerShell Gallery today. It would allow you to install modules into an Azure Function App, making them available to all PowerShell functions or only to a specific PowerShell function.
Workaround
You can work around this issue today by manually creating a modules folder with the Microsoft Azure Storage Explorer. I created one in the /data folder for my Azure Function App. Once that is created, place any modules you want to be able to load, including their dependencies, inside of that modules folder as is (don’t extract the contents, just copy over the root module folders). Once those are in place, add a function similar to the following to any PowerShell function in your Azure Function App where you want to use those modules, and manually load the module with Import-ModuleEx:
function Import-ModuleEx {
    [CmdletBinding()]
    [OutputType([System.Void])]
    param(
        [Parameter(Position=0, Mandatory=$true)]
        [ValidateNotNullOrEmpty()]
        [System.String]
        $Name
    )
    try {
        # Remember the current module path so it can be restored afterwards.
        $oldPSModulePath = $env:PSModulePath
        $modulePathEntries = @($env:PSModulePath -split ';')
        $localModulesFolder = "${env:HOME}\data\Modules"
        # Temporarily add the shared modules folder if it is not already present.
        if ($modulePathEntries -notcontains $localModulesFolder) {
            $env:PSModulePath += ";${localModulesFolder}"
        }
        Import-Module -Name $Name -Verbose
    } finally {
        # Restore the original module path.
        $env:PSModulePath = $oldPSModulePath
    }
}
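With that function defined, usage is a single line (the module name below is just a placeholder for whatever root module folder you copied into /data/Modules):

Import-ModuleEx -Name MyModule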
Realistically, the modifications of $env:PSModulePath here should not be necessary, nor should I have to manually invoke Import-Module (nor my Import-ModuleEx wrapper). I should simply be able to invoke a command and let PowerShell’s autoloading do the rest for me. That is how modules have worked since PowerShell 2.0 was released in 2009.
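For example, this is what autoloading looks like in native PowerShell (Get-Widget and its owning module are hypothetical names): as long as the module sits in a folder listed in $env:PSModulePath, this is the entire function body needed.

# No Import-Module call required; invoking the command triggers PowerShell's module autoloading.
Get-Widget -Name 'Example'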
If this is not properly recognized as a design issue, I would appreciate someone explaining to me why the wheel needs to be reinvented here for Azure Function Apps.
Sharing information across functions
Within my Azure Function App, I have five separate PowerShell functions. Three of those functions use the same set of credentials to connect to a managed resource, and the same CSV data to do something with that resource. The current interface allows me to upload files I have on disk, or add new files directly in the UI; however, referencing those files is a challenge, for the following reasons:
- The $pwd variable and the [Environment]::CurrentDirectory property both report the location as C:\Windows\System32.
- The $env:HOME environment variable refers to a folder that is three levels higher than the folder where the run.ps1 file that defines your function body is located.
- The $PSScriptRoot common variable is $null in the scope of the function when it is executed, because of how that function is executed (as a dynamically created and invoked script block instead of as a PowerShell file).
- For the same reason as the previous one, $MyInvocation cannot be used to identify the path to the script file either.
The only way it seems I can reference files I upload for a specific function is by using the entire path, which I figured out is ${env:HOME}/site/wwwroot/MyAzureFunctionName/MyFileName.
To further complicate things, there does not seem to be a way to upload files such that they are accessible to multiple Azure functions from the Azure Function App console. Instead, I need to choose a file location (which I did: I’m using /data/Files, right next to my /data/Modules folder), and then use the Azure Storage Explorer to upload the shared files I want to reference from multiple functions into that folder. Then, I can reference my files using the path ${env:HOME}/data/Files/MyFileName.
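Until that improves, path construction ends up looking something like the following sketch (the folder layout matches what I described above; the function and file names are placeholders):

# Per-function file uploaded next to run.ps1 (the full path is currently required).
$functionRoot = Join-Path -Path $env:HOME -ChildPath 'site\wwwroot\MyAzureFunctionName'
$rows = Import-Csv -Path (Join-Path -Path $functionRoot -ChildPath 'MyFileName.csv')

# Shared file placed under the manually created /data/Files folder.
$sharedFiles = Join-Path -Path $env:HOME -ChildPath 'data\Files'
$sharedContent = Get-Content -Raw -Path (Join-Path -Path $sharedFiles -ChildPath 'MyFileName')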
Recommendation
This all needs to be much easier. There really should be:
- A variable that refers to the location of the current function that is being executed.
- The current folder set to the location of the function being executed, so that relative paths (e.g. ./MyFileName) can be used from there.
- A shared file location that is well-known, with a variable that refers to that location so that files in that folder can be easily referenced in functions within the Azure Function App.
Workaround
You can create these locations manually, using the folder paths I shared or your own, but the issue with this workaround is that it’s manual and may differ from one Azure Function App to another.
Credential management
Azure Automation, and SMA before it, both support credential management in a secure fashion. Azure Function Apps need something similar so that credentials can be managed securely and are not visible in plain text when viewing files or scripts on screen.
Recommendation
This one is pretty straightforward: you just need an encrypted store to put credentials into, and a PowerShell command that can be used to get them back out, unencrypted, so that they can be used in a function inside of an Azure Function App.
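Something analogous to Azure Automation’s Get-AutomationPSCredential would be ideal. As a purely hypothetical sketch of the desired shape (neither command below exists today):

# Hypothetical cmdlet, shown only to illustrate the desired shape of the API.
$credential = Get-FunctionAppPSCredential -Name 'ManagedResourceAccount'
Connect-MyManagedResource -Credential $credential   # placeholder for whatever command the function actually calls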
Workaround
Right now I’m storing my credentials unencrypted in separate JSON files so that they at least won’t appear on screen if I am demonstrating my functionality to someone else.
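A sketch of that workaround, assuming a JSON file with userName and password properties stored in the shared /data/Files folder (the file and property names are placeholders):

# WARNING: the password is handled in plain text; this is a stopgap, not a real solution.
$raw = Get-Content -Raw -Path (Join-Path -Path $env:HOME -ChildPath 'data\Files\serviceAccount.json') | ConvertFrom-Json
$securePassword = ConvertTo-SecureString -String $raw.password -AsPlainText -Force
$credential = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $raw.userName, $securePassword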
Verbose output
Some PowerShell commands produce verbose output if requested, and you can use the Write-Verbose command to write something to the verbose stream. This is very useful when troubleshooting, and it distinguishes between output data and verbose messaging. There doesn’t seem to be a way to enable verbose output in Azure Functions, and worse, even if you enable it manually in a script with $VerbosePreference = 'Continue', you don’t get any verbose messages in the logs or output stream.
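A minimal illustration of the behaviour described above:

# Even with the preference set, the verbose stream never shows up in the function's log output.
$VerbosePreference = 'Continue'
Write-Verbose -Message 'Connecting to the managed resource...'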
Recommendation
Verbose output is important. There should be an option to turn on verbose output for each function without modifying that function’s code, and Azure Function Apps should most certainly show verbose output when PowerShell is configured to produce it.
Workaround
None
Warning output
As with the verbose output issue, warning output is ignored/discarded. This is a significant limitation that should be fixed.
Recommendation
Support warning output – it’s there for a reason!
Workaround
None
Write-Error and non-terminating errors
Azure Function Apps stray from PowerShell’s default error handling behaviour by treating all errors, including non-terminating errors, as terminating. This behavioural change is concerning: Azure Function Apps run PowerShell in a way that differs from how PowerShell itself runs. I understand the need to block certain things for security reasons; that makes complete sense. However, I don’t feel changing core PowerShell behaviour in Azure Function Apps is the right thing to do, because it makes every script, function, or module out there potentially hit or miss: something that runs in native PowerShell may not run in Azure Function Apps for the wrong reasons. Conversely, users who are less familiar with PowerShell and learn how it behaves in Azure Function Apps will eventually discover that PowerShell doesn’t work that way anywhere except Azure Function Apps.
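A minimal illustration of the difference:

# In a normal PowerShell session, Write-Error emits a non-terminating error and the next line still runs;
# in an Azure Function App, the same call terminates the invocation instead.
Write-Error -Message 'Row 3 could not be processed'
Write-Output 'Continuing with the remaining rows'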
Recommendation
Fix this so that the way Azure Function Apps use PowerShell is consistent with how PowerShell itself works.
Workaround
None. You can only use Write-Output to capture text in a benign way in the logs, and Out-File to output data to the caller of your function.
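A sketch of that pattern, assuming an HTTP-triggered function where $res is the output file path supplied by the runtime:

# Anything written to the output stream ends up in the invocation log.
Write-Output 'Processing request...'

# The response returned to the caller has to be written to the file referenced by $res.
$body = @{ status = 'ok' } | ConvertTo-Json
Out-File -FilePath $res -InputObject $body -Encoding ascii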
Final comments
I believe all of these unnecessary differences should be pushed out of the product in favour of consistency with the core PowerShell language, except where the product must behave differently for security reasons.
Thanks for listening if you made it this far.
Top GitHub Comments
@tohling My pleasure. I have one more important thought on this topic to share while it is top of mind.
Returning output from an Azure Function written in PowerShell
Right now it seems that Azure Functions written in PowerShell have to return data via the Out-File cmdlet, putting all output data into a file referenced by the $res variable, and that log messages are written either by using the Write-Output command or simply by not capturing output. This is also a questionable design, for the following reasons:
Recommendations
Workaround
I think I can get all of this (minus the PowerShell v5-specific content which requires a backend upgrade that I cannot do) working using proxy functions and by changing my Azure Functions to use a specific framework/format. It really should be designed from the ground up to work this way though.
+1 for Sharing information across functions. Being able to reference $PSScriptRoot would definitely be handy…