How to Securely Call a Logic App from an Azure Function + Benefits
Last updated on October 16, 2021
Probably the most common reason to choose Azure Logic Apps or Microsoft Power Automate is how fast and easy it is to automate processes with them, which can lead to reduced implementation costs. Even though Azure Logic Apps cost money per executed action, and Power Automate flow runs start to cost money after the monthly run limit has been exceeded, building and running them for many years can still be much cheaper than coding an Azure Function to do the same thing, even though the function is practically free to run. However, there is one thing that can cause unexpected problems when using Logic Apps and Power Automate, and that is polling.
As I mentioned earlier, running an Azure Logic App costs money for every executed action. This also applies to triggers: every time you poll an information source with a Logic App, you are spending money, even if there is nothing to process. This can make a big difference if you need to poll frequently.
In the case of Power Automate, you don’t need to pay extra for polling. However, there can be a long delay before the process starts, and there is no way to control the polling frequency the way you can for Azure Logic Apps. This can be a problem if your process needs to be triggered almost immediately when new data becomes available.
OMG! So what should I do?
One way to get around this issue is by using an Azure Function to do the polling. The Azure function can then trigger the Logic App or the Power Automate flow only when there is data to process. By doing this, you can potentially save quite a bit of money and only pay for the actual processing of data.
So how do you know when you should use an Azure function to do the polling? After all, using the built-in polling triggers of Logic Apps and Power Automate is just so fast and easy. Setting up an Azure function requires a bit more effort. To find out, ask yourself these questions:
- How often do you need to poll? And how many sources of information do you need to poll? Calculate how many times in total polling will occur in a month.
- How much information do you expect to receive in a month? In other words, how many of those polling runs will actually end up processing information? And conversely, how many times will you be polling for nothing? The latter value is the one that matters. Check the pricing information to see how much this useless polling will cost you in a month.
- Is the polling cost per month acceptable to the customer? How about the polling costs per year? Or how much would the polling cost in total during the whole estimated lifespan of the application?
- How much time do you estimate it would take you to implement an Azure function to do the polling instead? You know your hourly rate; how much would it cost the customer?
- Do you save so much time just by using one of the built-in triggers that it evens out the polling costs in the long term? Or would implementing the Azure function still end up saving a hefty sum of money? If your answer to this very last question is yes, then polling with an Azure function would be the better option.
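To make that back-of-the-envelope calculation concrete, here is a small sketch. All the numbers, including the per-action price, are made-up placeholders for illustration only; check the current Azure Logic Apps pricing page for real figures.

```csharp
using System;

public static class PollingCostEstimate
{
    // Rough estimate of what "empty" polling runs cost per month.
    // pricePerAction is a placeholder value, not a real Azure price.
    public static double MonthlyEmptyPollingCost(
        int pollsPerHour, int sources, double emptyRunRatio, double pricePerAction)
    {
        var pollsPerMonth = pollsPerHour * 24 * 30 * sources;
        return pollsPerMonth * emptyRunRatio * pricePerAction;
    }

    public static void Main()
    {
        // Example: one queue polled once a minute, 95 % of runs find nothing,
        // with a hypothetical price of $0.000025 per trigger execution.
        var cost = MonthlyEmptyPollingCost(60, 1, 0.95, 0.000025);
        Console.WriteLine($"Wasted polling cost per month: ${cost:F2}");
    }
}
```

Multiply the result by the number of sources and by the years the application is expected to live, and compare that total against your estimated implementation hours for the Azure function.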
The Plan
In the example of this blog post, I have a Logic App, and I’ll be polling an Azure Queue Storage. Instead of configuring the Logic App trigger When there are messages in a queue to poll my storage queue every few seconds, I’ll have an Azure function that will get triggered automatically as soon as (and only when) a message appears in the queue. That message is then forwarded to the logic app by using an HTTP request.
To be able to start a Logic App (or a Power Automate flow) from an Azure function with an HTTP request, the logic app needs to use the When an HTTP request is received trigger. After you’ve saved your Logic App at least once, the HTTP POST URL field contains the URL where we need to send our request to start our Logic App.
The URL has the following format:
https://prod-55.westeurope.logic.azure.com:443/workflows/0007c53.../triggers/manual/paths/invoke?api-version=2016-06-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=p3mfrJd9mrphIf4ueNYrz8GnaEvBF110fsnOcT4Nu94
All Logic App (and Power Automate) URLs have a Shared Access Signature (SAS) token for added security. The SAS token sig parameter is used for authorizing the caller to use the Logic App. Often people just add the URL with its complete SAS token to their source code — and from there again to the version control — and don’t think much of it. But because SAS token signatures are sensitive information, shouldn’t we treat them with the same care as we treat our passwords, and store them in Azure Key Vault whenever possible?
Let’s add the Logic App URL to the Azure function application settings, but instead of including the SAS token signature in it, we store the signature in Azure Key Vault. In our code, we can fetch it from there using the Managed Service Identity (MSI) of our Azure function and then put together the complete request URL at runtime. The signature is safer in the key vault, and if it is ever compromised, a new one can be generated for the Logic App and easily updated in the vault.
Creating the Azure function
You can build the Azure function and publish it to Azure straight from Visual Studio. You can also develop and publish Azure functions with Visual Studio Code if that is your preferred editor. However, these steps are for Visual Studio IDE.
- Create a new Azure Functions project in Visual Studio. You should be able to find it under the Cloud category. If you can’t see the option, install the Azure development workload for your Visual Studio via the Visual Studio Installer.
- In the next dialog, select how you want to trigger your Azure function. For my Azure function, I’m selecting the Queue trigger.
- In the Storage Account drop-down, select Browse…, and either pick an existing storage account from your Azure subscription or create a new one.
- Finally, fill in the other trigger-specific information (e.g., the queue name), and press OK.
To utilize Azure Key Vault and to authenticate to it using MSI, install the following NuGet packages for your project:
- Microsoft.Azure.KeyVault
- Microsoft.Azure.Services.AppAuthentication
If you are not using the queue trigger, you probably don’t want to copy all of the code below as is. Instead, just take the bits that you need.
The code below does two things: it forms the Logic App URL and then posts the queue message content (JSON) to it to start the Logic App. The base URL is fetched from the Azure function application settings, and the SAS token signature is fetched from Azure Key Vault. The signature is stored as a secret in the vault; to access it, we use the Azure function’s Managed Service Identity to authenticate to the vault and then fetch the signature using the secret URL, which we also get from the app settings. Once the base URL and the complete SAS token have been combined, we make a POST request to the complete URL with an HttpClient object. The request starts our Logic App, and the Azure function code execution ends.
```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.KeyVault;
using Microsoft.Azure.Services.AppAuthentication;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

namespace SendQueueMessageToLogicApp
{
    public static class SendQueueMessageToLogicApp
    {
        [FunctionName("SendQueueMessageToLogicApp")]
        // Connection is empty because the Azure function and the storage queue
        // will be in the same storage account
        public static void Run([QueueTrigger("queueName", Connection = "")] string myQueueItem, ILogger log)
        {
            try
            {
                var logicAppUrl = string.Format(GetAppSetting("LogicAppBaseUrl"), GetKeyVaultSecret("LogicAppSasToken"));
                PostMessage(logicAppUrl, myQueueItem);
            }
            catch (Exception e)
            {
                log.LogError($"{e.Message} {e.StackTrace}");
                // The function needs to fail for the message to be retried
                // or to end up in the poison queue
                throw;
            }
        }

        public static void PostMessage(string url, string body)
        {
            using (var httpClient = new HttpClient())
            {
                var content = new StringContent(body, Encoding.UTF8, "application/json");
                var response = httpClient.PostAsync(url, content).Result;
            }
        }

        public static string GetAppSetting(string key)
        {
            return Environment.GetEnvironmentVariable(key, EnvironmentVariableTarget.Process);
        }

        public static string GetKeyVaultSecret(string secretName)
        {
            var tokenProvider = new AzureServiceTokenProvider();
            var vaultClient = new KeyVaultClient(new KeyVaultClient.AuthenticationCallback(tokenProvider.KeyVaultTokenCallback));
            var secretUrl = string.Format(GetAppSetting("KeyVaultSecretUrl"), secretName);
            return Task.Run(async () => await vaultClient.GetSecretAsync(secretUrl)).Result.Value;
        }
    }
}
```
Now that your Azure function is ready, you can publish it to Azure, preferably via an Azure DevOps pipeline.
Deploying additional resources
Creating a new Azure Functions App resource in Azure automatically creates a new storage account as well (that’s where the function files are located). However, the storage queue used by our queue trigger doesn’t get deployed automatically even though we specified the queue name when creating the Azure Functions project (it was just used for generating the Run method).
If you want to poll a storage queue as I do, you can create the queue in the same storage account that is used by your functions app:
- In Azure portal, go to the storage account that is also used by your Azure function.
- Click on Queues.
- Click + Queue and give the new queue the same name you gave the queue when creating the Azure function. If you don’t remember it anymore, you can check it in the function code (the QueueTrigger attribute of the Run method).
Configuring the Managed Service Identity and Azure Key Vault
To enable MSI for our function app, follow these steps:
- Go to your function app in Azure portal and click on the Platform features tab.
- On the tab, you should see an option called Managed service identity. Click on it and ensure that it is turned On.
Now that the MSI is enabled for your function app, the app can be granted permissions to the Azure Key Vault we’ll create.
- In Azure portal, click on Create a resource.
- Search for Key Vault and create one.
- In the access policies section, click on Add new and then Select principal. Search for your function app and select it. In the Secret permissions drop-down, select Get. Press OK and Create.
After the Azure Key Vault has been deployed…
- Go to its Secrets and add a new one.
- Name the secret LogicAppSasToken and put the sig parameter value from the logic app URL as its value.
If the SAS token signature is ever compromised, you can generate a new one in the Access keys section of your Logic App. After that, get the new signature from the trigger URL and update the Azure key vault secret with the new value.
To generate a new signature for a Power Automate flow, you need to create a copy of your flow by using the Save As option behind the three dots. This creates a new version of your flow with a new URL in the trigger. In addition to updating the SAS token signature in the key vault, you also need to update the rest of the URL in the application settings of your Azure function, because the new flow has a different ID in its URL.
Azure function application setting configurations
On the Key Vault overview page, you’ll see its DNS name. That, combined with /secrets/{0}, is the KeyVaultSecretUrl value we need to add to the function app Application settings section.
Another value we need to include in the application settings is the Logic App base URL, which you get from the Logic App trigger. At the end of it, replace the sig value with {0}.
In both of these URLs, the {0} marks the place where the SAS token signature or the secret name will be inserted during runtime.
| APP SETTING | VALUE |
|---|---|
| KeyVaultSecretUrl | https://keyvaultname.vault.azure.net/secrets/{0} |
| LogicAppBaseUrl | https://prod-55.westeurope.logic.azure.com:443/workflows/<guid>/triggers/manual/paths/invoke?api-version=2016-10-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig={0} |
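To make the {0} placeholder mechanics concrete, here is a minimal sketch of what happens at runtime. The URLs and the signature below are dummy values for illustration, not real endpoints or secrets:

```csharp
using System;

public static class UrlComposition
{
    public static void Main()
    {
        // Dummy stand-ins for the real app settings and the Key Vault secret.
        var keyVaultSecretUrl = "https://keyvaultname.vault.azure.net/secrets/{0}";
        var logicAppBaseUrl = "https://prod-55.westeurope.logic.azure.com:443/workflows/<guid>/triggers/manual/paths/invoke?api-version=2016-10-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig={0}";
        var sasSignature = "dummy-signature-from-key-vault";

        // {0} in the secret URL is replaced by the secret name...
        var secretUrl = string.Format(keyVaultSecretUrl, "LogicAppSasToken");

        // ...and {0} in the base URL by the signature read from that secret.
        var completeUrl = string.Format(logicAppBaseUrl, sasSignature);

        Console.WriteLine(secretUrl);
        Console.WriteLine(completeUrl);
    }
}
```

This is exactly what the string.Format calls in the function code do with the KeyVaultSecretUrl and LogicAppBaseUrl settings.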
Now you should be all set!
- Alternatively, instead of fetching the secret in code, you can add a reference to the secret stored in the Azure key vault directly to the function app application settings, in the following format:
@Microsoft.KeyVault(SecretUri=https://<vault-name>.vault.azure.net/secrets/<secret-name>/<secret-version-id>)
- Get the application setting value in your code the same way as you would normally get any other app setting value.
Note that the key vault reference always requires the secret version ID. If you regularly update the secret in the vault (each update creates a new version with a different ID), using this method requires you to update the application setting as well. The code above does not have this issue: it always gets the latest version of the secret automatically.
Afterword
Another scenario in which you might want to call a Logic App or a Power Automate flow from an Azure function is if you have a long-running process, because Azure functions are limited to 5-10 minutes of run time. Can you think of other scenarios in which it can be useful to call a Logic App or a Power Automate flow from an Azure function? Let me know in the comments; I’m very interested to hear your thoughts! Also, if you don’t yet follow me on Twitter, feel free to do so, and I’ll notify you whenever I publish new articles and if there are other interesting happenings or news I think you should know about. Thanks for reading, and until next time!
Laura
Hi Laura,
Stumbling on this old post. For other people’s sake, I believe it is also possible to reference the secret without a version :
@Microsoft.KeyVault(SecretUri=https://<vault-name>.vault.azure.net/secrets/<secret-name>/)
Cheers!
Thanks for the heads up, Francois! I’ll add this post to my backlog of “posts to update.”
Laura
Hi Laura,
Hope you are doing well. I liked your post explaining Logic Apps, function apps and queues. It would be good if you could provide a few more samples on queues, function apps and Logic Apps. I am interested in building a workflow in Azure with multiple steps that will execute some business rules based on a trigger. If you could share some more documents, that would be really good.
My scenario is: we will receive a file as an input and need to process that file in different stages, like checking data, calculating pricing, matching pricing and sending refused contract details. The record count could be more than 5000. I want to build a workflow and liked your idea. I will wait for your comment…
Thanks.
Hi Saket,
Thanks for the idea! If I ever end up building something similar to what you are describing, I’ll definitely consider blogging about it. Until then, google is your friend. 🙂
Laura