Queuing for Web API methods - C#

I have one Web API method:
[HttpPost]
public void SaveData()
{
}
I want to implement it in such a way that, if the code inside the SaveData() method is already executing, the SaveData method will not accept any other call; i.e. I want to keep calls to the SaveData method in a queue. When one call finishes, the next call will occur.
Note: Calling the SaveData() method is not within my scope. I cannot decide whether to call this API method synchronously or asynchronously. The only thing within my scope is writing the code for the SaveData method; it will be called through an external system.

I strongly suggest that you use a message queuing system in your application. mjwills suggested two great message brokers in the comments section:
RabbitMQ: open-source, lightweight, and easy to deploy on premises and in the cloud
Amazon Simple Queue Service (SQS): a fully managed, cloud-based solution provided by AWS
There's an integration effort before you can use either of these solutions in your app, but afterwards it will be as simple as:
[HttpPost]
public void SaveData()
{
    var msg = new SaveDataMessage();
    /* populate msg object... */
    this.queueClient.Publish(msg);
}
This pseudo-code publishes a message to the queue. On the other end, a queue subscriber receives and processes these messages sequentially, for instance:
public void OnMessage(SaveDataMessage msg)
{
    /* process message... */
}
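For concreteness, here is a minimal sketch of both ends using the RabbitMQ.Client package (v6.x); the queue name "save-data" and the shape of SaveDataMessage are assumptions for illustration:
using System;
using System.Text;
using System.Text.Json;
using RabbitMQ.Client;
using RabbitMQ.Client.Events;

// Shared setup: one durable queue for SaveData messages.
var factory = new ConnectionFactory { HostName = "localhost" };
using var connection = factory.CreateConnection();
using var channel = connection.CreateModel();
channel.QueueDeclare(queue: "save-data", durable: true, exclusive: false, autoDelete: false);

// Publisher side (inside SaveData): serialize and enqueue the message.
var body = Encoding.UTF8.GetBytes(JsonSerializer.Serialize(new SaveDataMessage()));
channel.BasicPublish(exchange: "", routingKey: "save-data", basicProperties: null, body: body);

// Subscriber side: prefetchCount = 1 makes the worker handle one message at a time,
// which gives you the strictly sequential processing the question asks for.
channel.BasicQos(prefetchSize: 0, prefetchCount: 1, global: false);
var consumer = new EventingBasicConsumer(channel);
consumer.Received += (sender, ea) =>
{
    var msg = JsonSerializer.Deserialize<SaveDataMessage>(Encoding.UTF8.GetString(ea.Body.ToArray()));
    /* process message... */
    channel.BasicAck(ea.DeliveryTag, multiple: false);
};
channel.BasicConsume(queue: "save-data", autoAck: false, consumer: consumer);

public class SaveDataMessage { /* fields to persist */ }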
Additional benefits are (i) the subscriber can run in an independent process, which means your API request can return immediately while a background worker takes care of processing messages, and (ii) this architecture stays compatible with load-balanced APIs as your app scales up.
It's definitely worth the effort to build this structure into your app early on. It will give you more flexibility and productivity in the long run.
I have written an article about this subject which contains more detailed, complementary information to this answer: Using queues to offload Web API

Related

Write an independent job which has no impact on the current system in a .NET Core API

I'm working on a CRM application, and my client wants to download some information from the last 6 months. At the moment I have created an API endpoint which returns a FileContentResult object; that opens a new tab in the browser and automatically downloads an Excel file.
But this process is time-consuming (there are over 500K records), and users don't wait on the same page until the process is done. So, once a user changes between pages I get issues, and sometimes the application returns a timeout error since the API response is slow.
Now I'm planning to enhance that same function/API endpoint by introducing a silent job: once the user clicks the "Download" button, the process will start and a message will appear stating "Your download process has been started. You will receive an email with the report within the next 15 minutes". This way users don't have to wait and can do something else in the system.
Currently, I'm using an async task and await until the job is done.
public async Task<FileContentResult> ExportData()
{
    // ... process data and create the Excel file ...
    return new FileContentResult(*some byte array*, "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet")
    {
        FileDownloadName = "Data.xlsx"
    };
}
and I'm calling this method with:
await exportService.ExportData();
My concern is: what are the things I should change here in order to avoid any impact on other processes and run this as a background job? Once I get the result, I will send an email with an attachment.
Please help me with your valuable ideas. Thanks in advance
You should use a basic distributed architecture, as I describe on my blog. Specifically:
Instead of creating the report in your ASP.NET app, your ASP.NET app should just create a message indicating that the report should be created, and place that message into a durable queue.
Then have a separate, independent process read the messages from that queue, generate the report, and send the email.
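As a sketch of that shape (the queue technology is your choice; this example assumes the Azure.Storage.Queues package, and all names are illustrative):
using System;
using System.Text.Json;
using System.Threading.Tasks;
using Azure.Storage.Queues;
using Azure.Storage.Queues.Models;

public class ExportRequest
{
    public string UserEmail { get; set; }
    public int Months { get; set; }
}

// 1. In the ASP.NET app: just enqueue a message and return immediately,
//    showing the "You will receive an email..." notice to the user.
public async Task EnqueueExportAsync(QueueClient queue, string userEmail)
{
    var request = new ExportRequest { UserEmail = userEmail, Months = 6 };
    await queue.SendMessageAsync(JsonSerializer.Serialize(request));
}

// 2. In a separate worker process: read messages, build the report, email it.
public async Task RunWorkerAsync(QueueClient queue)
{
    while (true)
    {
        QueueMessage[] messages = await queue.ReceiveMessagesAsync(maxMessages: 1);
        foreach (QueueMessage msg in messages)
        {
            var request = JsonSerializer.Deserialize<ExportRequest>(msg.Body.ToString());
            // ... generate the Excel file and send the email with the attachment ...
            await queue.DeleteMessageAsync(msg.MessageId, msg.PopReceipt);
        }
        await Task.Delay(TimeSpan.FromSeconds(5)); // idle poll interval
    }
}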

Tracking outgoing requests in Azure Functions

As part of a microservice-based solution that we are building, we have a number of Azure Functions sitting in one Azure Function App. The functions orchestrate numerous requests to different APIs, some of which take a long time to complete. We added Application Insights to the functions to allow for some tracking of the requests made, but dependency tracking is not working yet in Azure Functions. It is possible to track dependencies manually, but that involves inserting tracking code around each dependency call; we want to avoid manually tracking dependencies on each and every call.
One of the solutions I have thought of is to create a request tracker that tracks all outgoing web requests from the functions. Within the request tracker I could then track the dependency requests, including their timing. I want to hook the request tracker into some sort of web traffic handler; unfortunately, I was unable to find much about doing this. A lot of posts mention using the System.Net trace writer for this, but as far as I can see that requires a Web.config to set up, and functions do not have one.
I have seen a few posts mentioning creating a request wrapper and placing it on my outgoing requests, but unfortunately that is not an option, as we use a number of packages that make requests internally. If you have any ideas that could get me going in the right direction, please let me know. Thanks
Update:
I added the following helper method, which allows me to manually track tasks as dependency requests:
public static async Task<T> TrackDependency<T>(this Task<T> task, string dependencyName, string callName, string operationId)
{
    var telemetryClient = new TelemetryClient();
    var startTime = DateTime.UtcNow;
    var timer = System.Diagnostics.Stopwatch.StartNew();
    var success = true;
    T result = default(T);
    try
    {
        result = await task;
    }
    catch (Exception)
    {
        success = false;
        throw; // rethrow so callers still observe the failure
    }
    finally
    {
        // Record the call as a dependency, correlated to the current operation.
        timer.Stop();
        var dependencyTelemetry = new DependencyTelemetry(dependencyName, callName, startTime, timer.Elapsed, success);
        dependencyTelemetry.Context.Operation.Id = operationId;
        telemetryClient.Track(dependencyTelemetry);
    }
    return result;
}
It can then be used as follows:
await client.Accounts.UpdateWithHttpMessagesAsync(accountId, account).TrackDependency("Accounts", "UpdateAccounts", requestContextProvider.CorrelationId);
I can now see individual request dependencies in Application Insights, but obviously the telemetry on them is very limited; it does not contain path info or much else.
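If richer telemetry is needed, one option (a sketch; it assumes the caller can pass the request URI through) is to populate the standard DependencyTelemetry fields when constructing the item:
// Hypothetical enrichment: requestUri would have to be supplied by the caller.
var dependencyTelemetry = new DependencyTelemetry(dependencyName, callName, startTime, timer.Elapsed, success)
{
    Type = "HTTP",     // dependency kind shown in the portal
    Data = requestUri  // e.g. the full URL of the outgoing call
};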
So when you say dependency tracking is not working in Azure Functions, what exactly do you mean? Have you actually added and configured the Application Insights SDK in your function project yet? The out-of-the-box monitoring experience with Azure Functions doesn't automatically add dependency tracking, but if you add and configure the Application Insights SDK in your function project, it should start tracking everything going on in there.

Combining multiple services in a single Azure web role

I have a web application hosted on Azure, which spins up roles for a short period of time to perform a task. This task is necessary, so I can't do without it. I want to combine these services and assign them to a single role, so that when the service is run ten times in an hour, I don't get billed for ten hours of use.
I found a blog post and potential solution, created in 2012. However, this project is horribly out of date, uses MVC3, and uses packages that haven't been available for several years now. Trying to make it work is not feasible.
Is multiple services on a single role still a valid solution, or do I have to take another approach?
I agree with David Makogon; the web/worker role model hasn't changed in years. If you'd like to include background processing in your web role, you can still create the web role, override the Run() method in the WebRole class, and implement your worker logic in Run(). The following simple sample, which adds a message to a storage queue once a minute, works fine on my side; please refer to it.
// Requires the Azure Storage client library (Microsoft.WindowsAzure.Storage).
public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        return base.OnStart();
    }

    public override void Run()
    {
        // Replace the code below with your own background logic.
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse("my connection string");
        CloudQueueClient queueClient = storageAccount.CreateCloudQueueClient();
        CloudQueue queue = queueClient.GetQueueReference("mymessage");
        queue.CreateIfNotExists();
        while (true)
        {
            CloudQueueMessage message = new CloudQueueMessage("worker run at " + DateTime.UtcNow.ToString());
            queue.AddMessage(message);
            System.Threading.Thread.Sleep(60000); // run once per minute
        }
    }
}

Creating a new AppDomain that calls a method in the same class where the AppDomain is made

I want to start the potentially long-running thread below in its own AppDomain, to prevent the web server from aborting it during recycling. It compiles fine, but at runtime I get this cryptic error:
Type is not resolved for member 'MyCore.MyWebService,MyCore,
Version=5.0.0.0, Culture=neutral, PublicKeyToken=null'.
How do I find out which member is not resolved?
Are there any better ways to run a long-standing thread in an MVC business service layer that does not get aborted by the server's recycling mechanism?
Here is the code:
namespace MyCore
{
    [Serializable]
    public class MyWebService : IMyWebService
    {
        // Enclosing method assumed here; the original snippet omitted it.
        public void StartPopulate(long lDatabaseID)
        {
            AppDomain domain = AppDomain.CreateDomain("Domain");
            Thread.CurrentThread.Name = "MVCThread";
            domain.SetData("lDatabaseID", lDatabaseID);
            domain.DoCallBack(() =>
            {
                long lID = Convert.ToInt64(AppDomain.CurrentDomain.GetData("lDatabaseID"));
                Thread thread = new Thread(() =>
                {
                    PopulateTables(lID);
                });
                thread.Name = "DomThread";
                thread.Start();
            });
        }
    }
}
IIS is heavily optimised to respond very quickly to hundreds of small simultaneous requests, and it just isn't the right tool for what you're attempting. You can try to work around that, but in the long term you'll be better off building a tool that is designed for long-running tasks. You'll then have a pre-packaged solution for the next time this problem arises.
The basic idea is to create an external application that does your background processing, with some way to pass tasks to it and get results back. I like using the database to communicate, as most web applications that need background processing already use one. Add a 'tasks' table with {status, startedDateTime, finishedDateTime, parameters, etc.}, then write an external application that periodically looks for a new task, completes it, and updates the database; a sketch follows below. Your web site can poll the database for status, or your application could make an AJAX call to notify the web site when a job has completed (a small iframe in the web site header that shows waiting/completed tasks can be useful if someone will be waiting for the job to complete, and is easy to do).
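A minimal sketch of that external worker, assuming a SQL Server 'Tasks' table with the columns above (all names here are illustrative):
using System;
using System.Data.SqlClient;
using System.Threading;

class TaskWorker
{
    static void Main()
    {
        while (true)
        {
            using (var conn = new SqlConnection("<connection string>"))
            {
                conn.Open();
                // Atomically claim one waiting task so two workers never grab the same row.
                var claim = new SqlCommand(
                    @"UPDATE TOP (1) Tasks
                      SET Status = 'Running', StartedDateTime = SYSUTCDATETIME()
                      OUTPUT inserted.Id, inserted.Parameters
                      WHERE Status = 'Waiting'", conn);

                long? taskId = null;
                string parameters = null;
                using (var reader = claim.ExecuteReader())
                {
                    if (reader.Read())
                    {
                        taskId = reader.GetInt64(0);
                        parameters = reader.GetString(1);
                    }
                }

                if (taskId.HasValue)
                {
                    // ... do the actual work for this task using 'parameters' ...
                    var finish = new SqlCommand(
                        "UPDATE Tasks SET Status = 'Done', FinishedDateTime = SYSUTCDATETIME() WHERE Id = @id", conn);
                    finish.Parameters.AddWithValue("@id", taskId.Value);
                    finish.ExecuteNonQuery();
                }
            }
            Thread.Sleep(TimeSpan.FromSeconds(30)); // poll interval
        }
    }
}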
EDIT: Before you do the above, review HangFire (which works inside IIS, as a Windows Service, or as a console app). Same principles, but a pre-packaged solution. Note that I haven't implemented it myself yet, but it looks good.
Although it's a bit of work to set up, handing this task off to a Windows Service is a good approach if you might have multiple tasks and need them responded to quickly. There are a lot of tutorials on the web that will help you create a Windows Service, such as http://www.codeproject.com/Articles/106742/Creating-a-simple-Windows-Service but you'll have to build a simple task executor on top of that so if that's the way you want to go I'd look for a pre-built task engine (I couldn't find one quickly but I'm probably using the wrong search phrase).
But that's overkill if turn-around time isn't important and a better approach for you might be to create a small console application that will be started every five minutes by task scheduler. It would connect to the database, execute any waiting tasks then shut down again. This is easier to debug and install than a Windows service and achieves the same goal of moving the task execution out of IIS.
Remember that you still have to detect and handle Windows shutdown so that you don't get half-finished orphaned jobs - at the very least just tag that task as aborted and exit cleanly.
Alright, after having mucked with Hangfire, I finally got it to work in .NET 4.0 and MVC 3. I had to install Common.Logging.Core 2.2.0, since NuGet installed the wrong version (3.3.0).
In my initial controller I added the following:
namespace Core.Controllers
{
    ...
    public void Configuration(IAppBuilder app)
    {
        app.UseHangfire(config =>
        {
            config.UseSqlServerStorage(ConnectionString.GetTVConnectionString());
            config.UseServer();
        });
    }
    ...
}
ConnectionString.GetTVConnectionString() gets the connection string from the config file.
At the top of the file I added the following:
[assembly: OwinStartup(typeof(Core.Controllers.BaseController))]
In the code that starts the background job I added the following, passing in a long instead of the class and having the job load the POCO class from the db:
BackgroundJob.Enqueue(() => PopulateTables(lDatabaseID, JobCancellationToken.Null));
The Enqueue() function returns a job id that can later be used to cancel the job, if needed, through the BackgroundJob.Delete(jobId) function, for example:
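// Small sketch of that id round-trip (names as in the snippets above).
string jobId = BackgroundJob.Enqueue(() => PopulateTables(lDatabaseID, JobCancellationToken.Null));
// ... later, if the job needs to be cancelled ...
BackgroundJob.Delete(jobId);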
In the job method I then have this
while (idxMin < max)
{
    try
    {
        cancellationToken.ThrowIfCancellationRequested();
        ....
    }
    catch (JobAbortedException jobEx)
    {
        ....
    }
}
It's important to use dependency injection, so my class got an added parameterless constructor that re-reads the connection string rather than having it passed in.
public MyWebService()
    : this(ConnectionString.GetTVConnectionString())
{
}

public MyWebService(string sConnStr)
{
    msConnStr = sConnStr;
}
After that it seems to run pretty well. A number of tables are added to the database specified in the connection string. So far the jobs seem to survive recycling on the web server.

How to run a long-lasting process asynchronously under ASP.NET?

.NET 4.5, ASP.NET MVC: What is the best way to run a long-lasting process (1-2 minutes) from an ASP.NET application, given that it should run in a single-threaded fashion? I mean the process is initiated for one user at a time only; executions for all other users have to wait until the current execution is done. The scenario is the following: the user clicks a button that runs some sort of long-lasting calculation, an HTTP response is returned to the user immediately, and the user then has to request the status of the calculation with a separate request manually. Abortion of the ASP.NET HTTP session should not lead to termination of the process; it should keep going. The process might run on the same or a separate server.
I'll show you how to perform this task with http://hangfire.io – an incredibly easy way to perform fire-and-forget, delayed and recurring tasks inside ASP.NET applications. No Windows Service required.
First, install the package through NuGet. If you have any problems, please see the Quick Start guide in the official documentation.
PM> Install-Package Hangfire
Open your OWIN Startup class and add the following lines:
public void Configure(IAppBuilder app)
{
    GlobalConfiguration.Configuration.UseSqlServerStorage("connection_string");
    app.UseHangfireDashboard();
    app.UseHangfireServer();
}
Then write the method that will do the long-running work (I applied an attribute so that only one instance of the method runs at a time; note that the attribute takes a lock timeout in seconds):
[DisableConcurrentExecution(timeoutInSeconds: 10 * 60)]
public void LongRunning()
{
    // Some processing stuff
}
And then call the method in the background as fire-and-forget, so you can respond to the user immediately:
public ActionResult Perform()
{
    BackgroundJob.Enqueue(() => LongRunning());
    return View();
}
If you want to notify the user about job completion, consider using SignalR and extending the LongRunning method accordingly.
.NET 4.5.2 adds QueueBackgroundWorkItem, which you can use to schedule a task. If you don't control the server (e.g. when it's rebooted), the 90-second default grace period on app pool shutdown may not be enough (unless you can detect that the task didn't complete and run it on another server). For more details see "QueueBackgroundWorkItem to reliably schedule and run background processes in ASP.NET".
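A minimal sketch of that API (HostingEnvironment lives in System.Web.Hosting; the work itself is a placeholder):
using System;
using System.Threading.Tasks;
using System.Web.Hosting;

// Inside a controller action: queue the work and return immediately.
// ASP.NET signals the CancellationToken when the app pool is shutting down,
// giving queued items a limited grace period to finish.
HostingEnvironment.QueueBackgroundWorkItem(async cancellationToken =>
{
    // ... the long-lasting calculation; observe the token to stop cleanly ...
    await Task.Delay(TimeSpan.FromMinutes(1), cancellationToken);
});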
I would suggest using a product such as NServiceBus to offload the processing and run it in single-threaded mode. The advantage of this is that all requests will be processed in order, and the processing is offloaded from the web server; you don't really want long-running processes happening on a web server.
If you control the server and need something simpler than a full framework like Hangfire, you can build a console app (.exe) that does the work, then launch it with the Process.Start method; you can also invoke the .exe from SQL Server, a Windows service, etc.
