
The current design for PowerShell functions in an Azure Function App


I just completed creating my first Azure Function App, with five separate PowerShell functions (four triggered by HTTP/REST, one triggered when messages are added to an Azure Storage Queue). While working on this I ran into several design issues that raised questions I hope to get answered. For items that you agree are issues with Azure Function Apps, I will log separate issues so that they can be tracked and properly fixed. For any items that are not issues and simply reflect a misunderstanding on my part, I would really appreciate it if you could highlight what I am missing.

PowerShell module support

When it comes to PowerShell modules, I noticed that the original Azure Function Apps design was to simply search for all .dll, .psd1 and .psm1 files in a modules subfolder under an Azure function, and that about 5 months ago this was extended to search recursively for those same file types. Both the original design and the extended design are seriously flawed: they are a significant step backwards for module support in PowerShell, and they should be reconsidered.

The current design has the following flaws:

  1. It strongly discourages the use of any modules other than your own, because any module beyond the simplest (single-file) modules will require manual modification. That inevitably leads to bugs that may go undetected by the person making the modifications and that are difficult to find, since you won’t have access to the native PowerShell debugger.
  2. It encourages duplication of code where such duplication does not need to exist, so that bugs fixed in the original source code persist in the copies inside the Azure Function App longer than they should.
  3. It discourages module dependencies by unnecessarily preventing any module that takes a dependency from being able to load properly without specific scripting to make it work the way it should in native PowerShell.
  4. It completely ignores $env:PSModulePath and all of the value that brings to PowerShell modules.
  5. It loses any and all notion of precedence among modules.

With the design as is today, it appears that you are reinventing the wheel with the thinking of a PowerShell 1.0 scripter. This design needs to be discarded in favor of a proper one.

Recommendation

Azure Function Apps should automatically update $env:PSModulePath in the runspace used to invoke an Azure Function that is based on PowerShell such that it includes two additional folder paths: one for all functions within an Azure Function App, and one for each PowerShell function within an Azure Function App. The folder path specific to a PowerShell function should take precedence (be defined first in $env:PSModulePath) over the folder path for all functions. This would allow for command overrides. The current design (to recursively search a modules folder for specific files) should be discarded in favor of this, so that Azure Function Apps properly leverage PowerShell’s built-in module discovery mechanism ($env:PSModulePath).
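As a sketch of this recommendation (the folder names below are my own assumptions, not an existing Azure Function Apps convention), the runtime could prepend the two paths like this before invoking a function:

    # Hypothetical sketch of the recommended behaviour; the folder names
    # below are assumptions, not an existing convention.
    $functionModules = "${env:HOME}\site\wwwroot\MyAzureFunctionName\Modules"
    $sharedModules   = "${env:HOME}\data\Modules"

    # The function-specific path comes first so it takes precedence over the
    # shared path (allowing command overrides); the existing path comes last.
    $env:PSModulePath = "${functionModules};${sharedModules};${env:PSModulePath}"

With that in place, PowerShell’s normal module discovery and autoloading would find modules in either folder without any extra scripting.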

There should also be a well-known shared modules folder that users can copy modules into, at least using the Azure Storage Explorer, and eventually via the Azure Function Apps UI.

Added benefit

An added benefit to this design, which is well worth pursuing, is that you could have an “Add to Azure Function App” button next to modules in the PowerShell Gallery. This would be very similar to the “Add to Azure Automation” button that exists in the PowerShell Gallery today. It would allow you to install modules into Azure Function Apps, making them available to all PowerShell functions or only to a specific PowerShell function.

Workaround

You can work around this issue today by manually creating a modules folder with the Microsoft Azure Storage Explorer. I created mine in the /data folder for my Azure Function App. Once that folder exists, place any modules you want to be able to load, including their dependencies, inside it as-is (don’t extract the contents; just copy over the root module folders). Then add a function similar to the following to any PowerShell function in your Azure Function App where you want to use those modules, and load each module manually with Import-ModuleEx:

function Import-ModuleEx {
    [CmdletBinding()]
    [OutputType([System.Void])]
    param(
        [Parameter(Position=0, Mandatory=$true)]
        [ValidateNotNullOrEmpty()]
        [System.String]
        $Name
    )
    # Save the current module path so it can be restored afterwards.
    $oldPSModulePath = $env:PSModulePath
    try {
        # Append the shared modules folder if it is not already present.
        $modulePathEntries = @($env:PSModulePath -split ';')
        $localModulesFolder = "${env:HOME}\data\Modules"
        if ($modulePathEntries -notcontains $localModulesFolder) {
            $env:PSModulePath += ";${localModulesFolder}"
        }
        # Import the module (and its dependencies) by name.
        Import-Module -Name $Name -Verbose
    } finally {
        # Restore the original module path.
        $env:PSModulePath = $oldPSModulePath
    }
}
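With that helper in place, loading a module from the shared folder is a one-liner (the module name below is a placeholder for whatever you copied into the modules folder):

    # 'MyHelperModule' stands in for a module you copied into
    # ${env:HOME}\data\Modules along with its dependencies.
    Import-ModuleEx -Name MyHelperModule

    # The module stays loaded even though Import-ModuleEx restores the
    # original $env:PSModulePath afterwards.
    Get-Command -Module MyHelperModule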

Realistically, the modifications of $env:PSModulePath here should not be necessary, nor should I have to manually invoke Import-Module (nor my Import-ModuleEx wrapper). I should simply be able to invoke a command and let PowerShell’s autoloading do the rest for me. That is how modules have worked since PowerShell 2.0 was released in 2009.

If this is not properly recognized as a design issue, I would appreciate someone explaining to me why the wheel needs to be reinvented here for Azure Function Apps.

Sharing information across functions

Within my Azure Function App, I have five separate PowerShell functions. Three of those functions use the same set of credentials to connect to a managed resource, and the same csv data to do something with that resource. The current interface allows me to upload files I have on disk, or add new files directly in the UI; however, referencing those files is a challenge, for the following reasons:

  1. The $pwd variable and the [Environment]::CurrentDirectory property both report the location as C:\Windows\System32.
  2. The $env:HOME environment variable refers to a folder that is three levels higher than the folder where the run.ps1 file that defines your function body is located.
  3. The $PSScriptRoot automatic variable is $null in the scope of the function when it is executed, because of how that function is executed (as a dynamically created and invoked script block instead of as a PowerShell script file).
  4. For the same reasons as the previous one, $MyInvocation cannot be used to identify the path to the script file either.

The only way it seems I can reference files I upload for a specific function is by using the entire path, which I figured out is ${env:HOME}/site/wwwroot/MyAzureFunctionName/MyFileName.

To further complicate things, there does not seem to be a way to upload files such that they are accessible to multiple Azure functions from the Azure Function App console. Instead I need to choose a file location (which I did: I’m using /data/Files, right next to my /data/Modules folder), and then use the Azure Storage Explorer to upload the shared files I want to reference from multiple functions into that folder. Then, I can reference my files using the path ${env:HOME}/data/Files/MyFileName.
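Putting those two locations together, referencing per-function and shared files looks roughly like this (the file names are illustrative):

    # A file uploaded for a specific function via the Azure Function App UI:
    $perFunctionFile = "${env:HOME}/site/wwwroot/MyAzureFunctionName/MyFileName.csv"

    # A file shared across functions, uploaded with the Azure Storage Explorer:
    $sharedFile = "${env:HOME}/data/Files/MyFileName.csv"

    # Either path can then be used like any other file path:
    $records = Import-Csv -Path $sharedFile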

Recommendation

This all needs to be much easier. There really should be:

  1. A variable that refers to the location of the function currently being executed.
  2. A current folder set to the location of the function being executed, so that relative paths (e.g. ./MyFileName) can be used from there.
  3. A well-known shared file location, with a variable that refers to it, so that files in that folder can be easily referenced from any function within the Azure Function App.

Workaround

You can create these manually, using the folder paths I shared or your own; the problem with this workaround is that it is manual, and the paths may differ from one Azure Function App to another.

Credential management

Azure Automation, and SMA before it, both support secure credential management. Azure Function Apps need something similar, so that credentials can be managed securely and are not visible in plain text when viewing files or scripts on screen.

Recommendation

This one is pretty straightforward: you need an encrypted store to put credentials into, and a PowerShell command to get them back out, decrypted, so that they can be used in a function inside an Azure Function App.

Workaround

Right now I’m storing my credentials unencrypted in separate json files so that they at least won’t appear on screen if I am demonstrating my functionality to someone else.
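As a sketch of that workaround (the file name and json property names are my own), the stored values can be turned back into a PSCredential for use with commands that expect one:

    # credentials.json contains something like:
    #   { "UserName": "svc-account", "Password": "not-actually-secret" }
    # NOTE: the password is stored unencrypted; this only keeps it off the screen.
    $raw = Get-Content -Path "${env:HOME}/data/Files/credentials.json" -Raw |
        ConvertFrom-Json
    $securePassword = ConvertTo-SecureString -String $raw.Password -AsPlainText -Force
    $credential = New-Object System.Management.Automation.PSCredential(
        $raw.UserName, $securePassword)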

Verbose output

Some PowerShell commands produce verbose output on request, and you can use the Write-Verbose command to write something to the verbose stream. This is very useful when troubleshooting, and it distinguishes between output data and verbose messaging. There doesn’t seem to be a way to enable verbose output in Azure Functions, and worse, even if you manually set $VerbosePreference = 'Continue' in a script, you don’t get any verbose messages in the logs or output stream.
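For example, based on the behaviour described above, neither of the following produces anything in the Azure Functions log, even though both would produce verbose messages in a normal PowerShell console (the module name is a placeholder):

    # In native PowerShell this enables verbose messages for the session...
    $VerbosePreference = 'Continue'

    # ...yet neither of these shows up in the Azure Functions log stream.
    Write-Verbose 'Connecting to the managed resource...'
    Import-Module MyHelperModule -Verbose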

Recommendation

Verbose output is important: there should be an option to turn on verbose output for each function without modifying that function’s code, and Azure Function Apps should most certainly show verbose output when PowerShell is configured to produce it.

Workaround

None

Warning output

Like the Verbose output issue, Warning output is ignored/discarded. This is a big limitation that should be fixed.

Recommendation

Support warning output; it’s there for a reason!

Workaround

None

Write-Error and non-terminating errors

Azure Function Apps stray from PowerShell’s default error handling behaviour by treating all errors, including non-terminating errors, as terminating. This behavioural change is concerning: Azure Function Apps run PowerShell in a way that differs from how PowerShell itself runs. I understand the need to block certain things for security reasons; that makes complete sense. However, I don’t feel changing core PowerShell behaviour in Azure Function Apps is the right thing to do, because it makes every script, function, or module out there potentially hit or miss: code that runs in native PowerShell may fail in Azure Function Apps for the wrong reasons. Conversely, users who are less familiar with PowerShell and learn how it behaves in Azure Function Apps will eventually discover that PowerShell doesn’t work that way anywhere else.

Recommendation

Fix this so that Azure Function Apps’ use of PowerShell is consistent with how PowerShell itself works.

Workaround

None. You can only use Write-Output to write benign text to the logs, and Out-File to return data to the caller of your function.
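A minimal sketch of the pattern that does work today, assuming an HTTP-triggered function where $res holds the path of the file bound to the output:

    # Text written to the output stream ends up in the function's log:
    Write-Output 'Processing request...'

    # Data for the HTTP caller must be written to the file referenced by $res:
    $result = @{ status = 'ok'; count = 3 } | ConvertTo-Json
    Out-File -Encoding Ascii -FilePath $res -InputObject $result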

Final comments

I believe all of these unnecessary differences should be removed from the product in favour of consistency with the core PowerShell language, except where behaviour must differ for security reasons.

Thanks for listening if you made it this far.

Issue Analytics

  • State: closed
  • Created: 6 years ago
  • Reactions: 16
  • Comments: 5 (1 by maintainers)

Top GitHub Comments

4 reactions
KirkMunro commented, Jun 5, 2017

@tohling My pleasure. I have one more important thought on this topic to share while it is top of mind.

Returning output from an Azure Function written in PowerShell

Right now it seems that Azure Functions written in PowerShell have to return data via the Out-File cmdlet, putting all output data into a file referenced by the $res variable, and that log messages are written either by using the Write-Output command or by simply not capturing output. This is also a questionable design, for the following reasons:

  1. It is backwards from how PowerShell is used in practice: normally, whatever a script or function outputs is the returned output, and messaging around that output is generated using streams (progress, warning, error, verbose, debug, or information; information is v5+ only) or using Write-Host (which goes to the host/console).
  2. It isn’t intuitive that you need to write your output to a file to return it to the caller.
  3. It is very limiting because the caller cannot get any streamed data from outside of the Azure Function App UI. The caller can only get output.

Recommendations

  1. Return the output from a function as the output that is sent to the output sources (the caller if invoked via HTTP, or to an Azure Storage Message Queue), and remove the need to invoke Out-File.
  2. For HTTP/REST API calls, provide a built-in query parameter (“streamOutput”?) that takes a true/false (non-zero/zero) value indicating that you want streamed output back from the call you invoke. This would be quite useful when debugging/troubleshooting. It would also allow a client-side module to be written (it could be added to AzureRm, but I could also write one easily enough and share it as an open source community project) that reproduces that output directly in PowerShell, just as if you were running the command locally, including progress bars, etc. The output sent back to the caller would then be a json structure that at the top level represents a collection of messages, each with the stream identifier and any additional relevant stream details (such as those Write-Progress needs, or the colors used in Write-Host); records containing object data would indicate that the stream was stdout, with the object data nested underneath. Using streamOutput should also automatically support invoking the REST API with pages so that streaming actually works with a client-side invocation.
  3. Implement Write-Host. It’s not a “bad command” anymore and may be used in modules that would be very useful to Azure Functions.
  4. Upgrade the backend to PowerShell 5.1+ (you really need this regardless of this issue).
  5. Use Write-Information for log messages. I have an excellent email from Lee Holmes (PowerShell Team) that describes how Write-Information can be used for logging purposes by leveraging the Tags parameter, and that would work very well here. You could automatically log any Write-Information messages that are tagged with AzureFunctionLog. That tag is pretty unique, so you’re not likely to get a conflict from Azure Functions that invoke modules; other Information stream content would still go to the Information stream (and be capturable on invocation via HTTP if you use the streamOutput option).
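Item 5 could look something like this in a function body (the AzureFunctionLog tag is the convention proposed above, and Write-Information requires the backend to run PowerShell 5.0 or later):

    # Tagged message: the runtime could route this to the function's log.
    Write-Information -MessageData 'Backend call completed.' -Tags 'AzureFunctionLog'

    # Untagged message: stays on the Information stream for the caller
    # (capturable via the proposed streamOutput option).
    Write-Information -MessageData 'Processed 42 records.'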

Workaround

I think I can get all of this (minus the PowerShell v5-specific content which requires a backend upgrade that I cannot do) working using proxy functions and by changing my Azure Functions to use a specific framework/format. It really should be designed from the ground up to work this way though.

0 reactions
frankfuu commented, Dec 18, 2017

+1 for Sharing information across functions. Being able to reference $PSScriptRoot would definitely be handy…
