I have an Azure Function that is hooked up to a Service Bus queue. It receives a message and generates records to save in Azure Table storage.
The Service Bus queue currently has a large number of pending messages.
Since there are so many pending messages, I would expect the Azure Function to scale out; however, that does not seem to be happening.
There is only one instance of the Function running, and I expected more to help empty the queue.
As specified in the Azure Functions documentation on scaling with Service Bus, I made sure the policy used by the Azure Function includes the Manage rights needed for scaling.
Question
Why is my Azure Function running on a Consumption Plan not scaling to help dequeue all the messages from the Service Bus?
Update
Here is an extract of the Azure Function:
public static class RecordImporter
{
    private static readonly IImporter Importer;

    static RecordImporter()
    {
        Importer = new AzureTableImporter();
    }

    [FunctionName("Importer")]
    public static async Task Run([ServiceBusTrigger("records", Connection = "serviceBusConnectionString")] string serializedRecords, ILogger log)
    {
        var records = JsonConvert.DeserializeObject<List<Record>>(serializedRecords);
        await Importer.AddRecordsAsync(records);
    }
}
Just an idea, because some people have faced a similar problem with Service Bus trigger scale-out:
https://github.com/microsoft/azure-pipelines-tasks/issues/6759
I notice you are using C#, so please do this when publishing:
(Clear the Run from package file check box. By default, VS uses Zip deployment, which can be changed to Web deployment through the steps above.)
Related
We have an Azure Function that is supposed to run as soon as a file is inserted into one of our Azure Storage blob containers. We are seeing that it actually takes anywhere from 1 to 10 minutes to run after the file appears in the container. We can't confirm it for sure, but it appears that the function is polling the container every 10 minutes looking for changes instead of running instantly upon insert.
Here is the code for the trigger; the order-requests container is the one where the file gets inserted:
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Microsoft.WindowsAzure.Storage.Blob;

namespace Integration
{
    public static class IntegrationFunction
    {
        [FunctionName("AbcIntegration")]
        public static async Task Run(
            [BlobTrigger("order-requests/{name}", Connection = "BlobStorageConString")] CloudBlockBlob blob,
            [Blob("order-responses/{name}")] CloudBlockBlob outputBlob,
            ILogger log)
        {
            var result = await new IntegrationService().IntegrateTask(blob, outputBlob);
            log.LogInformation(result);
        }
    }
}
How can we ensure the function runs the instant the file hits the blob?
After doing some research, there are two options to address this:
Use Event Grid triggering instead of Blob triggering: https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-event-grid?tabs=csharp
Ensure the function app is on an App Service Plan (not a Consumption Plan) and make sure the App is set to Always On.
Implementing #2 is what worked for us. We were on a Consumption Plan, which can take up to 10 minutes to fire off a trigger for a Function App. We switched the app to an App Service Plan, set it to Always On, and now we get immediate execution when a file hits the container.
It's important to note that an Always On App Service Plan involves additional costs. The idea behind a Consumption Plan is to pay only for the time the function is actually running, but that comes at the cost of possible cold starts, with executions delayed by up to 10 minutes.
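For completeness, option #1 (Event Grid triggering) can be sketched roughly as follows. This is a hypothetical function, assuming the Microsoft.Azure.WebJobs.Extensions.EventGrid package and an Event Grid subscription configured on the storage account for BlobCreated events; the function name is a placeholder:

```csharp
using Microsoft.Azure.EventGrid.Models;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.EventGrid;
using Microsoft.Extensions.Logging;

public static class BlobCreatedFunction
{
    // Fires as soon as Event Grid delivers the BlobCreated event,
    // even on a Consumption Plan, because no container polling is involved.
    [FunctionName("OnBlobCreated")]
    public static void Run([EventGridTrigger] EventGridEvent eventGridEvent, ILogger log)
    {
        log.LogInformation("Blob event received: {subject}", eventGridEvent.Subject);
    }
}
```

This avoids the Always On cost of an App Service Plan, at the price of setting up an Event Grid subscription.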
I have one Web API method:
[HttpPost]
public void SaveData()
{
}
I want to implement it in such a way that, if the code inside the SaveData() method is already executing, the SaveData method will not accept any other call.
In other words, I want to keep calls to the SaveData method in a queue.
When one call finishes, the next call will occur.
Note: how SaveData() is called is not within my scope. I cannot control whether this API method is called synchronously or asynchronously.
The only thing within my scope is the code of the SaveData method itself. It will be called through an external system.
I strongly suggest that you use a message queuing system in your application. mjwills provided two great message-broker software solutions in the comments section:
RabbitMQ: open-source, lightweight, and easy to deploy on premises and in the cloud
Amazon Simple Queue Service (SQS): a fully managed, cloud-based solution provided by AWS
There's an integration effort before you can use any of these solutions in your app, but afterwards it will be as simple as:
[HttpPost]
public void SaveData()
{
    var msg = new SaveDataMessage();
    /* populate msg object... */
    this.queueClient.Publish(msg);
}
This pseudo-code publishes a message to the queue. On the other end, a queue subscriber will be receiving and processing these messages sequentially, for instance:
public void OnMessage(SaveDataMessage msg)
{
    /* process message... */
}
Additional benefits are: i) the subscriber can run in an independent process, which means your API request can return immediately while a background worker takes care of processing messages; ii) this architecture remains compatible with load-balanced APIs as your app scales out.
It's definitely worth the effort of building this structure early on into your app. It will give you more flexibility and productivity in the long run.
I have written an article about this subject which contains more detailed, complementary information to this answer: Using queues to offload Web API
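As an illustration only, here is a rough sketch of what the subscriber side could look like, assuming the RabbitMQ option and its .NET client (the RabbitMQ.Client package); the queue name and host are placeholders:

```csharp
using System.Text;
using RabbitMQ.Client;
using RabbitMQ.Client.Events;

public class SaveDataWorker
{
    public void Start()
    {
        var factory = new ConnectionFactory { HostName = "localhost" };
        var connection = factory.CreateConnection();
        var channel = connection.CreateModel();
        channel.QueueDeclare("save-data", durable: true, exclusive: false, autoDelete: false);

        // A prefetch count of 1 makes the worker take one message at a time,
        // which gives the strictly sequential processing the question asks for.
        channel.BasicQos(prefetchSize: 0, prefetchCount: 1, global: false);

        var consumer = new EventingBasicConsumer(channel);
        consumer.Received += (sender, ea) =>
        {
            var body = Encoding.UTF8.GetString(ea.Body.ToArray());
            /* process message... */
            channel.BasicAck(ea.DeliveryTag, multiple: false); // ack only after processing
        };
        channel.BasicConsume("save-data", autoAck: false, consumer);
    }
}
```

Running a single instance of this worker guarantees that saves happen one at a time, while the API itself stays free to accept requests.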
As part of a microservice-based solution that we are building, we have a number of Azure Functions sitting in one Azure Function App. The functions orchestrate numerous requests to different APIs, some of which take a long time to complete. We added Application Insights to the functions to allow for some tracking of the requests made, but dependency tracking is not working yet in Azure Functions. It is possible to track dependencies manually, but that involves inserting tracking code around each dependency call, and we want to avoid manually tracking dependencies on each and every call.
One of the solutions I have thought of would be to create a request tracker that tracks all outgoing web requests from the functions. Within the request tracker I could then track the dependency requests including their time. I want to hook the request tracker into some sort of web traffic handler, unfortunately I was unable to find much about doing this. A lot of posts mention using System.Net trace writer for this, but as far as I can see this requires a Web.config to setup and functions do not have one.
I have seen a few posts mentioning creating a request wrapper and placing that on my outgoing requests, but unfortunately that is not an option, as we use a number of packages that make requests internally. If you have any ideas that could point me in the right direction, please let me know. Thanks.
Update:
I added the following helper method which allows me to manually track tasks as dependency requests
public static async Task<T> TrackDependency<T>(this Task<T> task, string dependencyName, string callName, string operationId)
{
    var telemetryClient = new TelemetryClient();
    var startTime = DateTime.UtcNow;
    var timer = System.Diagnostics.Stopwatch.StartNew();
    var success = true;
    T result = default(T);
    try
    {
        result = await task;
    }
    catch (Exception)
    {
        success = false;
        throw; // rethrow so the caller still sees the failure
    }
    finally
    {
        timer.Stop();
        var dependencyTelemetry = new DependencyTelemetry(dependencyName, callName, startTime, timer.Elapsed, success);
        dependencyTelemetry.Context.Operation.Id = operationId;
        telemetryClient.Track(dependencyTelemetry);
    }
    return result;
}
It can then be used as follows:
await client.Accounts.UpdateWithHttpMessagesAsync(accountId, account).TrackDependency("Accounts", "UpdateAccounts", requestContextProvider.CorrelationId);
I can now see individual request dependencies in Application Insights, but the actual telemetry on them is obviously very limited; it does not contain path info or much else.
So when you say dependency tracking is not working in Azure Functions, what exactly do you mean? Have you actually added and configured the Application Insights SDK in your function project yet? The out-of-the-box monitoring experience with Azure Functions doesn't automatically add dependency tracing, but if you add and configure the Application Insights SDK in your function project, it should start tracking everything going on in there.
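To illustrate the point about configuring the SDK, here is a rough sketch of wiring up automatic dependency collection, assuming the Microsoft.ApplicationInsights.DependencyCollector package is referenced; the instrumentation key is a placeholder:

```csharp
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.DependencyCollector;
using Microsoft.ApplicationInsights.Extensibility;

public static class TelemetrySetup
{
    public static TelemetryClient Initialize()
    {
        var configuration = TelemetryConfiguration.Active;
        configuration.InstrumentationKey = "<your-instrumentation-key>"; // placeholder

        // The dependency module hooks outgoing HTTP and SQL calls automatically,
        // so no per-call tracking code is needed.
        var module = new DependencyTrackingTelemetryModule();
        module.Initialize(configuration);

        return new TelemetryClient(configuration);
    }
}
```

Once the module is initialized, outgoing calls made through the usual .NET HTTP and SQL clients should appear as dependencies with full URL and timing details, unlike the manual TrackDependency approach above.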
I need to send a push notification once a day, and I have created the API for it, but I don't know how to run it from Azure.
I have read the documentation below, but it requires some files to be uploaded, whereas I just want to call an API once a day.
Link: https://learn.microsoft.com/en-us/azure/app-service-web/web-sites-create-web-jobs
Do I have to create another project and upload it as a WebJob?
Thank you.
You can create a new project and deploy it as a WebJob, but unless the App Service you are deploying to has 'Always On' enabled, your app may be asleep when the job is due to run.
Because of that, I would actually recommend Azure Functions for this. Under the Consumption plan you are charged only for the time the logic runs and for the execution count.
The function will look something like this in C# scripting:
using System;

public static async Task Run(TimerInfo myTimer, IAsyncCollector<Notification> notification, TraceWriter log)
{
    log.Info($"C# Timer trigger function executed at: {DateTime.Now}");

    await notification.AddAsync(new Notification()
    {
        // your code here
    });
}
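Since this is C# scripting, the schedule lives in the accompanying function.json rather than in an attribute. As a sketch, a binding configuration that fires once a day at 09:00 UTC could look like this (the hub name and connection setting name are placeholders):

```json
{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 0 9 * * *"
    },
    {
      "name": "notification",
      "type": "notificationHub",
      "direction": "out",
      "hubName": "<your-hub>",
      "connection": "NotificationHubConnection"
    }
  ]
}
```

The schedule uses the six-field CRON format (seconds, minutes, hours, day, month, day-of-week), so "0 0 9 * * *" means second 0, minute 0, hour 9, every day.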
EDIT
instructions for Azure Notification Hub
I'm new to Azure WebJobs. I've run a sample where a user uploads an image to blob storage and inserts a record into a queue; the job then retrieves that record from the queue as a signal to do something, such as resizing the uploaded image. Basically, the job uses a QueueTrigger attribute on a public static method to do all that.
Now I need a job that just does something like inserting a record into a database table every hour. It does not have any type of trigger; it just runs by itself. How do I do this?
I tried to have a static method in which I do the insert to the database. The job did start, but I got a message saying:
No functions found. Try making job classes public and methods public static.
What am I missing?
Edit
After Victor's answer I tried the following,
static void Main()
{
    JobHost host = new JobHost();
    host.Call(typeof(Program).GetMethod("ManualTrigger"));
}

[NoAutomaticTrigger]
public static void ManualTrigger()
{
    // insert records to db
}
but this time I got an InvalidOperationException:
'Void ManualTrigger()' can't be invoked from Azure WebJobs SDK. Is it missing Azure WebJobs SDK attributes?
If you don't use any input/output attributes from the WebJobs SDK (QueueTrigger, Blob, Table, etc.), you have to decorate the job method with the NoAutomaticTrigger attribute for it to be recognized by the SDK.
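A minimal sketch of the manual-invocation pattern, assuming the WebJobs SDK 1.x; note that the containing class must be public so the SDK's indexer can find the method (the class and method names here are illustrative):

```csharp
using System.IO;
using Microsoft.Azure.WebJobs;

public class Functions // must be public, or the SDK will not index the method
{
    [NoAutomaticTrigger]
    public static void ManualTrigger(TextWriter log)
    {
        log.WriteLine("Inserting records into the database...");
        // insert records to db
    }
}

class Program
{
    static void Main()
    {
        var host = new JobHost();
        host.Call(typeof(Functions).GetMethod("ManualTrigger"));
    }
}
```

With this in place, the WebJob can be deployed as a scheduled (triggered) WebJob and the schedule handled by the platform, or the Main method can loop and call the function itself.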
You could use the latest WebJobs SDK, which supports triggering job functions on a schedule based on the same CRON expression format.
You can use it to schedule your job every hour:
[Disable("DisableMyTimerJob")]
public static void TimerJob([TimerTrigger("01:00:00")] TimerInfo timerInfo, TextWriter log)
{
    log.WriteLine("Scheduled job fired!");
}
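The TimerTrigger attribute also accepts a six-field CRON expression instead of a TimeSpan string. A sketch of an hourly schedule written in CRON form (the function name is illustrative):

```csharp
public static void HourlyJob([TimerTrigger("0 0 * * * *")] TimerInfo timerInfo, TextWriter log)
{
    // "0 0 * * * *" = second 0, minute 0 of every hour
    log.WriteLine("Hourly job fired!");
}
```

The CRON form is more flexible than the TimeSpan form when you need schedules tied to wall-clock times (for example, only on weekdays).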
Moreover, the WebJobs SDK also has a DisableAttribute that can be applied to functions, allowing you to enable or disable them based on application settings. If you change the app setting in the Azure portal, the job will be restarted (https://azure.microsoft.com/en-us/blog/extensible-triggers-and-binders-with-azure-webjobs-sdk-1-1-0-alpha1/).