I have what seems to be a common problem with binding table storage in an Azure Function. I've read many, if not all, of the answers on Stack Overflow, such as
Why is Azure Function v2 unable to bind to CloudTable?
and as much as I can find on GitHub and in Microsoft's documentation. Despite trying all the suggested fixes, none have worked.
It's all the more confusing because I had it working. When I returned to the project a couple of days later, I started getting this error:
Microsoft.Azure.WebJobs.Host: Error indexing method 'GetCompetitions'. Microsoft.Azure.WebJobs.Host: Cannot bind parameter 'competitionsTable' to type CloudTable. Make sure the parameter Type is supported by the binding. If you're using binding extensions (e.g. Azure Storage, ServiceBus, Timers, etc.) make sure you've called the registration method for the extension(s) in your startup code (e.g. builder.AddAzureStorage(), builder.AddServiceBus(), builder.AddTimers(), etc.).
To get the function working locally I had to follow several steps after creating the function in Visual Studio, as follows:
install the relevant NuGet packages and add extensionBundle properties to host.json. The host file was modified to:
{
"version": "2.0",
"extensionBundle": {
"id": "Microsoft.Azure.Functions.ExtensionBundle",
"version": "[1.*, 2.0.0)"
}
}
local.settings.json:
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "UseDevelopmentStorage=true",
"FUNCTIONS_WORKER_RUNTIME": "dotnet"
}
}
Function:
[FunctionName("GetCompetitions")]
public static async Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req,
[Table("DailyFF", Connection = "AzureWebJobsStorage")] CloudTable competitionsTable,
ILogger log)
{
log.LogInformation("C# HTTP trigger function processed a request.");
TableQuery<ActiveCompetition> projectionQuery = new TableQuery<ActiveCompetition>().Select(
new string[] { "RowKey", "Name" });
var activeCompetitions = await competitionsTable.ExecuteQuerySegmentedAsync(projectionQuery, null);
List<Competition> competitions = new List<Competition>();
foreach(var c in activeCompetitions.Results)
{
competitions.Add(new Competition
{
Id = c.RowKey,
Name = c.Name
});
}
return competitions != null
? (ActionResult)new OkObjectResult(competitions)
: new BadRequestObjectResult("Nothing doing, try something else.");
}
This worked. I successfully queried local table storage several times.
As I say, when I returned to work on it, it no longer worked and the error above was thrown.
A few things to note:
The function does not use any 3rd-party libraries, so there should be no conflicts.
All libraries use WindowsAzure.Storage 9.3.1 (as generated by Visual Studio).
I tried targeting .NET Standard as suggested in many articles, but it made no difference.
It's possible that I installed the ASP.NET Core 3.x framework between writing the function initially and returning to it, but all settings appear to be correct.
I'm stuck, any help/suggestions appreciated, thanks!
[edit]
Something I should add, because it's looking like this is at the root of the issue: when I returned to the working function to continue development, a new version of the Azure Functions CLI was installed automatically when I ran the project. I was a little surprised, and I had to add a firewall exception as I had done previously.
I don't know which version was installed.
I've tried setting up a new profile and pointing to the latest version of the CLI downloaded from GitHub (that has brought its own issues, as I have to manually delete existing profiles from Properties\launchSettings.json to get it to work). It doesn't fix the function binding issue, however.
It seems to me there are a lot of unreliable "fixes" for this scenario, so I'd greatly appreciate any links at all to a working demo of Azure Functions developed in Visual Studio 2017.
I'm a bit wary of using Functions now, I have to say. Two days' work, and what was working is turning into something of a mare without even touching the working function.
So I figured it out, but I'm still not sure why it worked and then stopped working without any changes.
I removed the changes to host.json, as extension bundles aren't required if you install the packages with NuGet, as described here:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-register#vs
If you use Install-Package to reference a binding, you don't need to use extension
bundles. This approach is specific to class libraries built in Visual Studio.
I had read that, but as the function worked I hadn't paid much attention to it. Removing
"extensionBundle": {
"id": "Microsoft.Azure.Functions.ExtensionBundle",
"version": "[1.*, 2.0.0)"
}
and leaving the host.json file in its initially generated state fixed the problem.
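For reference, with the extensionBundle section removed, the initially generated v2 host.json is simply:

```json
{
  "version": "2.0"
}
```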
Just found this:
IQueryable isn't supported in the Functions v2 runtime. An alternative
is to use a CloudTable method parameter to read the table by using the
Azure Storage SDK. Here's an example of a 2.x function that queries an
Azure Functions log table:
If you try to bind to CloudTable and get an error message, make sure
that you have a reference to the correct Storage SDK version.
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-table
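One caveat about the CloudTable approach shown earlier: ExecuteQuerySegmentedAsync returns at most one segment (up to 1,000 entities), and the function above passes a null continuation token, so it only ever reads the first segment. A hedged sketch of reading every row (QueryAllAsync is an illustrative name; ActiveCompetition is the TableEntity type from the question, assumed defined elsewhere):

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Table;

public static class TableHelpers
{
    // Reads all matching entities by looping on the continuation token.
    public static async Task<List<ActiveCompetition>> QueryAllAsync(CloudTable table)
    {
        var query = new TableQuery<ActiveCompetition>()
            .Select(new[] { "RowKey", "Name" });

        var results = new List<ActiveCompetition>();
        TableContinuationToken token = null;
        do
        {
            // Each call returns one segment plus a token for the next one (or null).
            var segment = await table.ExecuteQuerySegmentedAsync(query, token);
            results.AddRange(segment.Results);
            token = segment.ContinuationToken;
        } while (token != null);
        return results;
    }
}
```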
Related
I am trying to use Managed Identity with Azure Functions V3 and a QueueTrigger.
The function code is defined like this:
[Function("ProcessUserData")]
public async Task ProcessUserData([QueueTrigger("%QueueSettings:UserDataQueue%", Connection = "QueueSettings:StorageAccount")] string queueItem, FunctionContext context)
{
var logger = context.GetLogger<QueueListener>();
...
}
According to the Microsoft documentation, this should be possible by defining some additional configuration properties:
<CONNECTION_NAME_PREFIX>__credential
<CONNECTION_NAME_PREFIX>__queueServiceUri
https://learn.microsoft.com/en-us/azure/azure-functions/functions-reference?tabs=blob#local-development-with-identity-based-connections
My local.settings.json looks like this:
// "QueueSettings:StorageAccount": "",
"QueueSettings:StorageAccount__queueServiceUri": "https://mytestfa.queue.core.windows.net/",
"QueueSettings:StorageAccount__credential": "managedidentity",
When trying to run the project locally I get the following error:
[2021-12-06T18:07:53.181Z] The 'ProcessUserData' function is in error: Microsoft.Azure.WebJobs.Host: Error indexing method 'Functions.ProcessUserData'. Microsoft.Azure.WebJobs.Extensions.Storage: Storage account connection string 'AzureWebJobsQueueSettings:StorageAccount' does not exist. Make sure that it is a defined App Setting.
When I use an empty connection string I get another error:
"QueueSettings:StorageAccount": "",
"QueueSettings:StorageAccount__queueServiceUri": "https://mytestfa.queue.core.windows.net/",
"QueueSettings:StorageAccount__credential": "managedidentity",
Error:
[2021-12-06T18:25:20.262Z] The 'ProcessUserData' function is in error: Microsoft.Azure.WebJobs.Host: Error indexing method 'Functions.ProcessUserData'. Microsoft.Azure.WebJobs.Extensions.Storage: Storage account connection string for 'AzureWebJobsQueueSettings:StorageAccount' is invalid.
This works fine when using the full connection string with the account key, but we have to use managed identities.
I have upgraded to the latest version of Azure Functions Core Tools (3.0.3904) and am using Visual Studio 2022.
Additional documentation that this should work:
https://devblogs.microsoft.com/azure-sdk/introducing-the-new-azure-function-extension-libraries-beta/
Thanks for any insights.
I was able to resolve this by installing the 5.0.0-beta.4 version of the NuGet package "Microsoft.Azure.Functions.Worker.Extensions.Storage".
Now the Managed Identity functionality is working as expected.
Hopefully this will go to GA soon.
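In csproj terms, that fix corresponds to a package reference along these lines (the version is taken from the answer above; the ItemGroup placement is the usual convention):

```xml
<ItemGroup>
  <!-- Pre-release isolated-worker storage extensions with identity-based connection support -->
  <PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.Storage" Version="5.0.0-beta.4" />
</ItemGroup>
```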
My team recently inherited an ASP.NET solution that runs on Azure App Services. One project in the solution seems to define C# code that leverages the Azure WebJobs SDK to run several functions. I am not a C# or ASP.NET developer by trade, but I'm involved in developing the build and release pipelines for the project.
I have two App Service environments that need to run the WebJobs project, but in the case of one of those environments, some of the functions should not run.
Each function seems to have its own .cs file, and these functions can inherit configuration from an App.config file (which can be transformed at build/publish time using files like App.Staging.config and App.Prod.config). An example of a function in the project might look like:
using Microsoft.Azure.WebJobs;
using System;
using System.Configuration;
using System.IO;
namespace My.Project.WebJobs
{
public class SomeTask
{
public void ExecuteTask([TimerTrigger(typeof(CustomScheduleDaily3AM))]TimerInfo timerInfo, TextWriter log)
{
var unitOfWork = new UnitOfWork();
var SomeSetting = int.Parse(ConfigurationManager.AppSettings["SomeSetting"]);
unitOfWork.Execute.Something();
}
}
}
With my limited understanding of my options, the first idea that occurred to me was to add an enable/disable switch to the method(s) (i.e. SomeTask or ExecuteTask in the example above) that reads a true/false value from a setting defined in App.config. Not being well-versed in C#, though, I'm not confident this is possible. Done this way, the function may still run, but no action would be taken because the method is disabled.
I feel as if there may be a solution that relies more on Azure configuration as opposed to function-level code changes. Any help is appreciated.
After researching, I found three ways to meet your need.
Create multiple App Service instances for the different environments, and publish your WebJobs to the appropriate instance.
Use staging slots.
Use the ASPNETCORE_ENVIRONMENT app setting, as shown below:
Configure ASPNETCORE_ENVIRONMENT on the portal. (Locally, this setting lives in the launchSettings.json file.)
Modify your code like this (just to show how it works; for more information see the doc):
if (_env.IsDevelopment())
{
Console.WriteLine(_env.EnvironmentName); //modify with your function
}
else if (_env.IsStaging())
{
Console.WriteLine(_env.EnvironmentName); //modify with your function
}
else
{
Console.WriteLine("Not dev or staging"); //modify with your function
}
Here is a reference about using multiple environments in ASP.NET Core.
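One more configuration-driven option beyond the three above, worth verifying against your SDK version: the WebJobs SDK ships a Disable attribute that makes the host skip a function entirely when a named app setting is "true" or "1". The setting name below is hypothetical; each App Service environment can then set it independently, with no code changes per environment:

```csharp
using System.IO;
using Microsoft.Azure.WebJobs;

namespace My.Project.WebJobs
{
    public class SomeTask
    {
        // "Disable_SomeTask" is a hypothetical app setting name. When that setting
        // is "true" or "1", the WebJobs host does not index or run this function.
        [Disable("Disable_SomeTask")]
        public void ExecuteTask([TimerTrigger(typeof(CustomScheduleDaily3AM))] TimerInfo timerInfo, TextWriter log)
        {
            // function body unchanged
        }
    }
}
```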
Using .NET Core/Standard (C#), I'd like to retrieve details of all the NuGet packages for a particular owner.
For example, supplying "charliepoole" would give me all the related packages, as on the page: https://www.nuget.org/profiles/charliepoole
I'm currently using the NuGet.ProjectModel package to retrieve individual packages by ID/version, but there don't seem to be any other related methods that can help me.
For reference, this is how I'm retrieving an individual package by ID and version:
public async Task<IPackageSearchMetadata> GetPackageAsync(string id, string version, CancellationToken cancellationToken = default)
{
var packageSource = new PackageSource("https://api.nuget.org/v3/index.json", "NugetV3");
var providers = new List<Lazy<INuGetResourceProvider>>();
providers.AddRange(Repository.Provider.GetCoreV3());
var repo = new SourceRepository(packageSource, providers);
var packageMetadataResource = await repo.GetResourceAsync<PackageMetadataResource>(cancellationToken);
var packageIdentity = new PackageIdentity(id, NuGetVersion.Parse(version));
var context = new SourceCacheContext
{
NoCache = true,
DirectDownload = true
};
return await packageMetadataResource
.GetMetadataAsync(packageIdentity, context, new NullLogger(), cancellationToken);
}
Is this possible using any of the official supplied nuget packages that talk to the nuget API?
Is this possible using any of the official supplied nuget packages that talk to the nuget API?
Not really, no.
The NuGet client team calls the packages they publish on nuget.org the NuGet (Client) SDK. NuGet.Protocol is the package that implements the client side of NuGet's server HTTP API. The NuGet SDK doesn't implement all of the server API, just the parts used in Visual Studio.
Since your question is about searching, the search docs are relevant. Looking at the response schema, we can see that owners is one of the fields, but looking at the request parameters, there's no query to ask specifically about an owner.
So, you can try passing the nuget.org account name you're interested in as a search keyword and hope that the search index has that information. When you get results back, you can check each package and ignore the ones that don't list the account you're looking for as a package owner.
Two other server API resources that have package metadata are the catalog (a transaction log of all packages processed by the ingestion pipeline) and package metadata, but neither shows owner information.
However, the NuGet server team's issue tracker is GitHub's NuGet/NuGetGallery repo, and I found an issue asking how to get the list of owners of a given package on nuget.org. They acknowledged it's a shortcoming of the protocol. You can upvote that issue to signal that it's important (there's only 1 upvote so far, so it seems this is not important to most customers). But they also replied with an unofficial, undocumented URL that contains the owner list for all packages. You can use this, but obviously consider the risk that it could theoretically change at any time. Having said that, the comment was posted two years ago and it still works.
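The keyword-search-then-filter approach can be sketched with NuGet.Protocol's search resource. This is a hedged sketch, not a definitive implementation: GetPackagesByOwnerAsync is a made-up name, only the first 100 results are checked, and whether Owners is populated depends on the search service:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using NuGet.Common;
using NuGet.Protocol;
using NuGet.Protocol.Core.Types;

public static class OwnerSearch
{
    // Hypothetical helper: search nuget.org using the account name as a keyword,
    // then keep only packages that actually list that account as an owner.
    public static async Task<IReadOnlyList<IPackageSearchMetadata>> GetPackagesByOwnerAsync(
        string owner, CancellationToken cancellationToken = default)
    {
        var repo = Repository.Factory.GetCoreV3("https://api.nuget.org/v3/index.json");
        var search = await repo.GetResourceAsync<PackageSearchResource>(cancellationToken);

        var results = await search.SearchAsync(
            owner,                                     // account name as a plain keyword
            new SearchFilter(includePrerelease: false),
            skip: 0, take: 100,                        // first page only in this sketch
            NullLogger.Instance, cancellationToken);

        // IPackageSearchMetadata.Owners is a comma-separated string and may be empty.
        return results
            .Where(p => (p.Owners ?? string.Empty)
                .Split(',')
                .Select(o => o.Trim())
                .Contains(owner, StringComparer.OrdinalIgnoreCase))
            .ToList();
    }
}
```

The filter step guards against the search service returning fuzzy keyword matches that merely mention the account name somewhere in the metadata.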
I need an Azure Functions blob trigger to fire on a container whose name is given at runtime by an app setting. I read it is possible to do this:
[FunctionName("Process")]
public static async Task Process([BlobTrigger("%BucketName%/{name}", Connection = "AzureWebJobsStorage")] Stream avroBlobStream, string name, TraceWriter log)
{
}
This works locally only if BucketName is in the Values field in appsettings.json.
{
"IsEncrypted": false,
"Values": {
"BucketName": "capture-bucket"
}
}
If it's not in the Values field, this is the error:
[6/24/2019 5:52:15 PM] Function 'SomeClass.Process' failed indexing and will be disabled.
[6/24/2019 5:52:15 PM] No job functions found. Try making your job classes and methods public. If you're using binding extensions (e.g. ServiceBus, Timers, etc.) make sure you've called the registration method for the extension(s) in your startup code (e.g. config.UseServiceBus(), config.UseTimers(), etc.).
I put a setting with just BucketName into the Azure Function app, but it gave me the same error. Could you suggest what the setting should be called, or what I am doing wrong in a real Azure environment? Should it be Values:BucketName? I have never seen an example on the Microsoft website with Values: as the prefix.
For your error, one situation I have tested is that the Microsoft.Azure.WebJobs.Extensions.Storage package is not installed. After installing it, the trigger works. You could give that a try.
As for dynamic bindings, there is a description in the official tutorial: Binding expressions - app settings. When you test locally, app setting values come from the local.settings.json file. I don't know why you are using appsettings.json. The format is just what you pasted.
And on Azure, because the settings in local.settings.json won't be deployed by Visual Studio, you have to go to your Azure Function's Configuration and set the binding name there.
I have tested this, and it works this way; it could process my blob file.
Hope this helps. If I've misunderstood your requirements, please let me know.
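Putting that together as a concrete sketch: locally, the %BucketName% binding expression resolves from the Values object of local.settings.json, and on Azure it is a plain application setting named BucketName (no Values: prefix) under the Function App's Configuration:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "BucketName": "capture-bucket"
  }
}
```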
I'm looking to use DbGeography via Entity Framework communicating with a SQL Server database, within an Azure Worker Role. As I understand it, DbGeography uses Microsoft.SqlServer.Types / SqlServerSpatial110.dll in the background, so to get it to work in Azure I've followed:
http://blogs.msdn.com/b/adonet/archive/2013/12/09/microsoft-sqlserver-types-nuget-package-spatial-on-azure.aspx and installed the NuGet package, then specified loading the SQL Server types in the OnStart method in WorkerRole.cs:
public override bool OnStart()
{
// Load SQL Server Types
SqlServerTypes.Utilities.LoadNativeAssemblies(AppDomain.CurrentDomain.BaseDirectory);
return base.OnStart();
}
I also then followed this blog post https://alastaira.wordpress.com/2011/08/19/spatial-applications-in-windows-azure-redux-including-denali/ where I explicitly added SqlServerSpatial110.dll to the project and set it to Copy always.
So the real issue is: immediately after deployment, everything works as expected. However, if I leave the Azure Worker Role alone for a bit (~30 minutes) and it receives no requests, the DbGeography portions of my code no longer function.
Is there something else I need to do?
Have I put the LoadNativeAssemblies call in the wrong place (should it be in Run()?)
Do Azure Worker Roles get recycled?
Has anyone come across this before or managed to use DbGeography with an Azure Worker Role, or have any insight as to what might be happening?
Very late reply, but I found your question while looking for a solution to loading SQL Server Types in Azure Functions.
The major issue is that the path the code runs from is not the same as in other application types. So you cannot use the suggested:
SqlServerTypes.Utilities.LoadNativeAssemblies(Server.MapPath("~/bin")); or
SqlServerTypes.Utilities.LoadNativeAssemblies(AppDomain.CurrentDomain.BaseDirectory);
In your Azure Function's Run parameters, add an ExecutionContext context, which allows you to retrieve the working folder.
By navigating up one level you can use this path for loading the SQL Server Types:
public static async Task<HttpResponseMessage> Run([HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)]HttpRequestMessage req, TraceWriter log, ExecutionContext context)
{
string path = context.FunctionDirectory;
string newPath = Path.GetFullPath(Path.Combine(path, @"..\"));
SqlServerTypes.Utilities.LoadNativeAssemblies(newPath);
I've since determined that my issue was with Entity Framework not fully loading entities, rather than anything to do with Microsoft.SqlServer.Types, but I'm leaving this question up instead of deleting it, just in case it helps someone.
I have no idea if this is the right thing to do, so moderators / smarter people, please feel free to edit or do whatever is appropriate.