Azure Functions Blob Trigger Dynamic Binding - c#

I need an Azure Functions blob trigger to trigger off a bucket that is given at runtime by an app setting. I read it is possible to do this:
[FunctionName("Process")]
public static async Task Process([BlobTrigger("%BucketName%/{name}", Connection = "AzureWebJobsStorage")] Stream avroBlobStream, string name, TraceWriter log)
{
}
This works locally if I have BucketName ONLY in the Values field in appsettings.json.
{
"IsEncrypted": false,
"Values": {
"BucketName": "capture-bucket",
}
}
If it's not in the Values field, this is the error:
[6/24/2019 5:52:15 PM] Function 'SomeClass.Process' failed indexing and will be disabled.
[6/24/2019 5:52:15 PM] No job functions found. Try making your job classes and methods public. If you're using binding extensions (e.g. ServiceBus, Timers, etc.) make sure you've called the registration method for the extension(s) in your startup code (e.g. config.UseServiceBus(), config.UseTimers(), etc.).
I put a setting named just BucketName into the Azure Function App's settings, but it gave me the same error. Could you suggest what the setting should be called, or what I am doing wrong, when actually in a real Azure environment? Should it be Values:BucketName? But I have never seen an example on the Microsoft website with Values: as the prefix.

For your error, one situation I have tested is that the Microsoft.Azure.WebJobs.Extensions.Storage package is not installed. After installing it, the trigger will work. You could give that a try.
As for dynamic bindings, there is a description in the official docs: Binding expressions - app settings. When you test locally, app setting values come from the local.settings.json file; I don't know why you are using appsettings.json. The format is exactly what you pasted.
On Azure, because the settings in local.settings.json are not deployed by Visual Studio, you have to go to your Function App's Configuration and add an application setting with the binding's name there.
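To make the two sides concrete, here is a sketch (the setting name and value are taken from the question):

local.settings.json, used only on the local machine:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "BucketName": "capture-bucket"
  }
}
```

On the portal (Function App > Configuration > Application settings), the setting is created with Name `BucketName` and Value `capture-bucket` — no `Values:` prefix. The `Values` wrapper exists only in the local file; in Azure, every entry under it becomes a flat application setting.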
I have tested this; it works and processes my blob file.
Hope this helps; if I've misunderstood your requirements, please let me know.

Related

Disabling WebJob Functions Depending on Environment

My team recently inherited an ASP.NET solution that runs on Azure App Services. One project in the solution seems to define C# code that leverages the Azure WebJobs SDK to run several functions. I am not a C# or ASP.NET developer by trade, but I'm involved in developing the build and release pipelines for the project.
I have two App Service environments that need to run the WebJobs project, but in the case of one of those environments, some of the functions should not run.
Each function seems to have its own .cs file, and it seems these functions can inherit configuration from an App.config file (which can be transformed at runtime using files like App.Staging.config and App.Prod.config). An example of a function in the project might look like:
using Microsoft.Azure.WebJobs;
using System;
using System.Configuration;
using System.IO;
namespace My.Project.WebJobs
{
public class SomeTask
{
public void ExecuteTask([TimerTrigger(typeof(CustomScheduleDaily3AM))]TimerInfo timerInfo, TextWriter log)
{
var unitOfWork = new UnitOfWork();
var SomeSetting = int.Parse(ConfigurationManager.AppSettings["SomeSetting"]);
unitOfWork.Execute.Something();
}
}
}
With my limited understanding of my options, the first idea that occurred to me was to potentially add an enable/disable switch to the method(s) (i.e. SomeTask or ExecuteTask in the example above) that might read its true/false value from a setting defined in App.config. Though, not being well-versed in C#... I'm not confident this is possible. Doing it this way, the function may still run, but no action is taken on account of the method(s) being disabled.
I feel as if there may be a solution that relies more on Azure configuration as opposed to function-level code changes. Any help is appreciated.
After researching, I found three ways to meet your need.
Create multiple instances for the different environments, and publish your WebJobs to the appropriate instance.
Use staging slots.
Use the 'ASPNETCORE_ENVIRONMENT' app setting. I'll show how below:
Configure ASPNETCORE_ENVIRONMENT on the portal. (Locally, this setting lives in the launchSettings.json file.)
Modify your code like this (just to show how it works; see the docs for more info):
if (_env.IsDevelopment())
{
Console.WriteLine(_env.EnvironmentName); //modify with your function
}
else if (_env.IsStaging())
{
Console.WriteLine(_env.EnvironmentName); //modify with your function
}
else
{
Console.WriteLine("Not dev or staging"); //modify with your function
}
Here is a reference: Use multiple environments in ASP.NET Core.
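One more configuration-only option worth checking (my suggestion, not part of the answer above): the WebJobs SDK ships a Disable attribute that reads an app setting at startup, so a function can be switched off per environment without touching the function body. A sketch, where the setting name Disable_SomeTask is my own choice:

```csharp
using System.IO;
using Microsoft.Azure.WebJobs;

namespace My.Project.WebJobs
{
    public class SomeTask
    {
        // The function is not triggered when the app setting
        // "Disable_SomeTask" is "true" or "1". Set that value per
        // environment via App.config transforms or the portal's
        // application settings.
        [Disable("Disable_SomeTask")]
        public void ExecuteTask([TimerTrigger(typeof(CustomScheduleDaily3AM))] TimerInfo timerInfo, TextWriter log)
        {
            // existing task body unchanged
        }
    }
}
```

This keeps the enable/disable switch entirely in configuration, which matches the "Azure configuration as opposed to function-level code changes" preference in the question.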

Azure Functions RunOnStartUp set in configuration rather than at compile time?

I have an Azure timer triggered function scheduled to run every 3 months in production. However in test environment I'd like it to run on start up, every time it is triggered.
At the moment I have:
[TimerTrigger("%TimerInterval%", RunOnStartup = false)]
I don't really want to change RunOnStartup to true, but wondered if there's a way of setting this in the configuration?
Is it possible to do something like:
RunOnStartup = "%RunOnStartUpBool%" and set that in appsettings?
Update 2022-03-30: My previous answer was updating your code to use an #if DEBUG preprocessor directive as a way to switch on the RunOnStartup=true method parameter. As of 2022, you can bypass that ungainly workaround and just select an option in the VS Code Azure Functions extension, which is less complex. There is more information here.
Another alternative would be logging into the Azure portal, navigating to your function app and using the function's Test/Run tab.
OLD ANSWER: There is a good SO question with multiple answers to this same question here.
My test environment is normally my local environment. So if we want code that ONLY runs locally and not in production, we can use a preprocessor directive in the middle of the method signature that sets RunOnStartup=true only in debug builds.
public static void Run([TimerTrigger("%TimerInterval%"
#if DEBUG
    , RunOnStartup = true // When debugging, run the job as soon as we press debug; no need to wait for the timer.
#endif
    )] TimerInfo myTimer)
{
    // function body
}
Explanation: during local development (debug builds) the #if DEBUG block is compiled in, enabling the RunOnStartup=true parameter. In production (release builds) the block is compiled out.
Clearly not the prettiest code. But much better than the alternatives ... such as having to wait on the timer trigger interval during dev.
You cannot set RunOnStartup dynamically from a configuration/environment variable at runtime. But you can solve your problem another way, since your purpose is to trigger the function manually at startup (or at any time): you can trigger it with a specialized HTTP call, as described below, for example from your test environment's deployment pipeline as a post-deployment step (or by any other means you prefer).
To run a non-HTTP-triggered function (like this timer-triggered one), you need a way to send a request to Azure to run the function. The URL used to make this request takes a specific form.
Host name: The function app's public location that is made up from the function app's name plus azurewebsites.net or your custom domain.
Folder path: To access non-HTTP-triggered functions via an HTTP request, you send the request through the admin/functions path.
Function name: The name of the function you want to run.
You use this request location along with the function's "master key" as the x-functions-key header in a POST request to Azure to run the function. Note: the Content-Type header should be set to application/json.
For details, see Manually run a non HTTP-triggered function.
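Putting the pieces above together, the request might look like this (app name, function name, and key are placeholders you fill in from your own Function App; the master key is under Function App > App keys in the portal):

```shell
# POST to the admin endpoint to run a non-HTTP-triggered function.
curl -X POST "https://<APP_NAME>.azurewebsites.net/admin/functions/<FUNCTION_NAME>" \
  -H "x-functions-key: <MASTER_KEY>" \
  -H "Content-Type: application/json" \
  -d "{}"
```

A 202 Accepted response indicates the host accepted the request; the function's own output shows up in its logs, not in the HTTP response body.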

Azure Functions - table storage binding not working

I have what seems to be a common problem with binding table storage in an Azure Function. I've read many, if not all, of the answers on Stack Overflow, such as
Why is Azure Function v2 unable to bind to CloudTable?
and as much as I can find on GitHub and in the MS documents. Despite trying all the fixes, none have worked.
It is all the more confusing because I had it working; when I returned to the project a couple of days later, I started getting this error:
Microsoft.Azure.WebJobs.Host: Error indexing method 'GetCompetitions'. Microsoft.Azure.WebJobs.Host: Cannot bind parameter 'competitionsTable' to type CloudTable. Make sure the parameter Type is supported by the binding. If you're using binding extensions (e.g. Azure Storage, ServiceBus, Timers, etc.) make sure you've called the registration method for the extension(s) in your startup code (e.g. builder.AddAzureStorage(), builder.AddServiceBus(), builder.AddTimers(), etc.).
To get the function working locally I had to follow several steps once I created the function in Visual Studio, as follows:
Install the relevant NuGet packages and add extensionBundle properties to host.json. The host file was modified to:
{
"version": "2.0",
"extensionBundle": {
"id": "Microsoft.Azure.Functions.ExtensionBundle",
"version": "[1.*, 2.0.0)"
}
}
local.settings.json:
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "UseDevelopmentStorage=true",
"FUNCTIONS_WORKER_RUNTIME": "dotnet"
}
}
Function:
[FunctionName("GetCompetitions")]
public static async Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req,
[Table("DailyFF", Connection = "AzureWebJobsStorage")] CloudTable competitionsTable,
ILogger log)
{
log.LogInformation("C# HTTP trigger function processed a request.");
TableQuery<ActiveCompetition> projectionQuery = new TableQuery<ActiveCompetition>().Select(
new string[] { "RowKey", "Name" });
var activeCompetitions = await competitionsTable.ExecuteQuerySegmentedAsync(projectionQuery, null);
List<Competition> competitions = new List<Competition>();
foreach(var c in activeCompetitions.Results)
{
competitions.Add(new Competition
{
Id = c.RowKey,
Name = c.Name
});
}
return competitions != null
? (ActionResult)new OkObjectResult(competitions)
: new BadRequestObjectResult("Nothing doing, try something else.");
}
This worked. I successfully queried local table storage several times.
As I say, when I returned to work on it, it no longer worked, with the error mentioned above thrown.
A few things to note:
The function does not use any 3rd party libraries so there should be no conflicts.
All libraries use WindowsAzure.Storage 9.3.1 (as generated by Visual Studio).
I tried targeting .NET Standard as suggested in many articles, but it made no difference.
It's possible that I installed the asp.net core 3.x framework between writing the function initially and returning but all settings appear to be correct.
I'm stuck, any help/suggestions appreciated, thanks!
[edit]
Something I should add, because it's looking like this is at the root of the issue: when I returned to the working function to continue development, a new version of the Azure Functions CLI (Core Tools) was installed automatically when I ran the project. I was a little surprised, and I had to add a firewall exception as I had done previously.
I don't know which version was installed.
I've tried setting up a new profile and pointing to the latest version of the CLI downloaded from GitHub (that has brought its own issues, as I have to manually delete existing profiles from properties\launchSettings.json to get it to work). It doesn't fix the function binding issue, however.
It seems to me there's a lot of unreliable "fixes" for this scenario, so I'd greatly appreciate any links at all to a working demo of Azure Functions developed in Visual Studio 2017.
I'm a bit wary of using Functions now, I have to say. Two days' work, and what was working is turning into something of a mare without even touching the working function.
So I figured it out, though I'm still not sure why it worked and then stopped working without any changes.
I removed the changes to host.json, as the docs say extension bundles aren't required if you install the packages with NuGet, as described here:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-register#vs
If you use Install-Package to reference a binding, you don't need to use extension
bundles. This approach is specific for class libraries built in Visual Studio.
I had read that but as the function worked I hadn't paid much attention to it. Removing
"extensionBundle": {
"id": "Microsoft.Azure.Functions.ExtensionBundle",
"version": "[1.*, 2.0.0)"
}
and leaving the host.json file in its initially generated state fixed the problem.
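For reference, after removing the bundle section the host.json is back to what Visual Studio generates:

```json
{
  "version": "2.0"
}
```

With the bindings coming from the NuGet package (Microsoft.Azure.WebJobs.Extensions.Storage) instead of a bundle, the two registration mechanisms no longer conflict.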
Just found this:
IQueryable isn't supported in the Functions v2 runtime. An alternative
is to use a CloudTable method parameter to read the table by using the
Azure Storage SDK. Here's an example of a 2.x function that queries an
Azure Functions log table:
If you try to bind to CloudTable and get an error message, make sure
that you have a reference to the correct Storage SDK version.
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-table

Deployed app function has empty host.json and no host keys

I'm having a weird issue with my Azure App Function and I can't find anything on this.
I republished my function without changing its code, but suddenly the function stopped working and I'm getting this message as soon as I navigate to the function's page on Azure:
Error:
Error retrieving master key.
If I navigate to the function's settings, I can see that no keys have been generated and that the host.json file is empty. Browsing my function's files using Kudu, however, shows that the file contents are correct.
Two more things make this weirder:
The function correctly works locally
If I take the code for another function and I deploy it on this one, my function works correctly, meaning that it's not an issue related to my function's configuration but rather to its code
Do you guys have any pointer on this?
EDIT:
Let me add more details on this.
Let's say I have 2 solutions, A.sln and B.sln.
I also have 2 App Functions on Azure, let's say F_1 and F_2.
A.sln and B.sln have the very same structure, the only difference is in business logic.
Same applies for F_1 and F_2, their only differences are the related storage accounts, as each function has its own.
Currently A.sln is deployed on F_1 and B.sln on F_2, and the only one working is F_1.
If I deploy A.sln on F_2, F_2 starts working, so my idea is that there's something wrong in B.sln's code because A.sln works with the very same configuration.
The Function App has a reference to a Storage account in application settings AzureWebJobsDashboard, AzureWebJobsStorage and WEBSITE_CONTENTAZUREFILECONNECTIONSTRING (if you are running on a consumption plan). Either clearing out this storage or simply recreating it fixed the problem.
I would also recommend creating separate storage accounts for every Function app - at least as long as these hard-to-find bugs are present. It is a lot easier to fix these kind of issues when they are only affecting a single Function app.
I don't know if this is the case here, but I found out that in my case (new deployment of Function App v3) host.json is empty on Azure, if there is a comment line in it. Removing comments solved my problem and host.json file is now deployed properly.
One of the reasons could be that the key inside the storage account has been rotated, so the connection strings referenced in the AzureWebJobsDashboard and AzureWebJobsStorage settings of the Azure Function will be different.
Solution: Go to the storage account referenced in AzureWebJobsDashboard and AzureWebJobsStorage -> Access Keys -> Copy the connection string under key1 and use this for the AzureWebJobsDashboard and AzureWebJobsStorage.
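If you prefer the command line, the same connection string can be fetched with the Azure CLI (the account and resource group names below are placeholders for your own):

```shell
az storage account show-connection-string \
  --name <STORAGE_ACCOUNT> \
  --resource-group <RESOURCE_GROUP> \
  --query connectionString \
  --output tsv
```

The value it prints is what goes into the AzureWebJobsDashboard and AzureWebJobsStorage application settings.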

Microsoft.SqlServer.Types not functioning properly on Azure Worker Role

I'm looking to use DbGeography via EntityFramework communicating with a SQL Server database, within an Azure Worker Role. As I understand it, DbGeography uses Microsoft.SqlServer.Types / SqlServerSpatial110.dll in the background, so in order to get it to work in Azure, I've followed:
http://blogs.msdn.com/b/adonet/archive/2013/12/09/microsoft-sqlserver-types-nuget-package-spatial-on-azure.aspx and installed the nuget package, then I've specified loading the SQL Server types in the OnStart method in WorkerRole.cs:
public override bool OnStart()
{
// Load SQL Server Types
SqlServerTypes.Utilities.LoadNativeAssemblies(AppDomain.CurrentDomain.BaseDirectory);
I also then followed this blogpost https://alastaira.wordpress.com/2011/08/19/spatial-applications-in-windows-azure-redux-including-denali/ where I explicitly added SqlServerSpatial110.dll to the project and set it to Copy always.
So the real issue is - immediately after deployment, everything works as expected. However, if I leave the Azure Worker role alone for a bit (~30 mins) and it receives no requests, the DbGeography portions of my code no longer function.
Is there something else I need to do?
Have I put the LoadNativeAssemblies call in the wrong place (should it be on Run() ?)
Do Azure Worker Roles get recycled?
Has anyone come across this before / managed to use DbGeography with an Azure Worker Role, or have any insight as to what might be happening?
Very late reply, but I found your question while looking for a solution for how to load SQL Server Types in Azure Functions.
The major issue is that the path the code runs from is not the same as with other application types, so you cannot use the suggested:
SqlServerTypes.Utilities.LoadNativeAssemblies(Server.MapPath("~/bin")); or
SqlServerTypes.Utilities.LoadNativeAssemblies(AppDomain.CurrentDomain.BaseDirectory);
Within your Azure Function Run params, add in ExecutionContext context which allows you to retrieve the working folder.
By navigating up one level you can use this path for loading the SQL Server Types:
public static async Task<HttpResponseMessage> Run([HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)]HttpRequestMessage req, TraceWriter log, ExecutionContext context)
{
string path = context.FunctionDirectory;
string newPath = Path.GetFullPath(Path.Combine(path, @"..\"));
SqlServerTypes.Utilities.LoadNativeAssemblies(newPath);
}
I've since determined that my issue was with Entity Framework not fully loading entities, rather than anything to do with Microsoft.SqlServer.Types, but I'm leaving this question up instead of deleting it, just in case it helps someone.
I have no idea if this is the right thing to do, so moderators / smarter people, please feel free to edit or do whatever.
