I have a few server-side processes that can take very long times to run (30-60m). These are a result of calling a web API from my webserver, and throttling the requests to match the maximum request rate of the service.
I want to be able to display the progress of this on the client, how can I go about doing this? Where do I start?
Previous answers have hinted at SignalR, but that does not seem to be a thing with ASP.NET Core.
Edit: It looks like SignalR is a thing with ASP.NET Core after all. As this is the case, I have a follow-up question:
I'm using AngularJS and managing the client-side state via AJAX requests to API endpoints. Are there any potential coding or performance "gotchas" when trying to also utilize SignalR in conjunction?
Additionally, are there alternatives that don't add SignalR as a dependency? It looks like it requires a beta version of .NET Core; I'd rather stay in stable-release land if at all possible.
You can use SignalR and the IProgress&lt;T&gt; interface to handle this. The basic implementation is something like this:
Server-side
[HubName("messageHub")]
public class MessageHub : Hub // This is your server-side SignalR hub
{
    public async Task Countdown(IProgress<int> progress) // This is your IProgress
    {
        // 60-second countdown as an example
        // NOT TESTED
        for (int i = 1; i <= 60; i++)
        {
            await Task.Delay(1000);
            progress.Report(i);
        }
    }
}
Client-side
// Hub connection
var messageConnection = $.hubConnection(url);
var messageHub = messageConnection.createHubProxy('messageHub'); // must match the [HubName] attribute
messageConnection.logging = true;
// Start the connection, invoke the hub method, and receive the IProgress reports via .progress()
messageConnection.start().done(function () {
    messageHub.server.countdown().progress(function (i) {
        console.log('Progress: ' + i);
    });
});
So, I need to implement a consumer in a Web API (.NET Core 3.1) application, and after reading the Microsoft documentation and watching several videos about it, I got to this solution.
This is an extension method for IServiceCollection, I'm calling it from the Startup.cs to instantiate my Consumer (the connection strings and container names are there for tests only):
private static async Task AddPropostaEventHub(this IServiceCollection services)
{
    const string eventHubName = "EVENT HUB NAME";
    const string ehubNamespaceConnectionString = "EVENT HUB CONNECTION STRING";
    const string blobContainerName = "BLOB CONTAINER NAME";
    const string blobStorageConnectionString = "BLOB CONNECTION STRING";

    string consumerGroup = EventHubConsumerClient.DefaultConsumerGroupName;

    BlobContainerClient storageClient = new BlobContainerClient(blobStorageConnectionString, blobContainerName);
    EventProcessorClient processor = new EventProcessorClient(storageClient, consumerGroup, ehubNamespaceConnectionString, eventHubName);

    processor.ProcessEventAsync += ProcessEvent.ProcessEventHandler;
    processor.ProcessErrorAsync += ProcessEvent.ProcessErrorHandler;

    await processor.StartProcessingAsync();
}
The ProcessEvent handler class:
public static class ProcessEvent
{
    public static async Task ProcessEventHandler(ProcessEventArgs eventArgs)
    {
        var result = Encoding.UTF8.GetString(eventArgs.Data.Body.ToArray());
        // DO STUFF
        await eventArgs.UpdateCheckpointAsync(eventArgs.CancellationToken);
    }

    public static Task ProcessErrorHandler(ProcessErrorEventArgs eventArgs)
    {
        // DO STUFF
        return Task.CompletedTask;
    }
}
This code is working, but my question is: is it okay to implement it like that? Is there a problem if the consumer never stops? Can it block other tasks (or requests) in my code?
Is there a better way to implement it using Dependency Injection in .NET Core?
I couldn't find any example of someone implementing this in a Web API; is there a reason for that?
As Jesse Squire mentioned, WebAPI isn't necessarily the correct method of implementation, but it primarily depends on what your goals are.
If you are making an API that also includes an Event Hub listener, you should implement it behind the IHostedService interface. Your existing AddPropostaEventHub() logic goes inside the interface's StartAsync(CancellationToken cancellationToken), which is what .NET Core uses to start a background task. Then, inside your Startup.cs, you register the handler as services.AddHostedService&lt;EventHubService&gt;();. This ensures that the long-running listener is handled properly without blocking your incoming HTTP requests.
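A minimal sketch of such a hosted service, reusing your existing setup (the EventHubService name is illustrative, and in practice you would inject the connection strings via IConfiguration rather than hard-coding them):

```csharp
public class EventHubService : IHostedService
{
    private readonly EventProcessorClient _processor;

    public EventHubService()
    {
        // Same wiring as AddPropostaEventHub; constants kept for brevity.
        var storageClient = new BlobContainerClient("BLOB CONNECTION STRING", "BLOB CONTAINER NAME");
        _processor = new EventProcessorClient(
            storageClient,
            EventHubConsumerClient.DefaultConsumerGroupName,
            "EVENT HUB CONNECTION STRING",
            "EVENT HUB NAME");
        _processor.ProcessEventAsync += ProcessEvent.ProcessEventHandler;
        _processor.ProcessErrorAsync += ProcessEvent.ProcessErrorHandler;
    }

    // Called by the host at startup; runs off the request path.
    public Task StartAsync(CancellationToken cancellationToken)
        => _processor.StartProcessingAsync(cancellationToken);

    // Called by the host on shutdown, so the processor stops cleanly.
    public Task StopAsync(CancellationToken cancellationToken)
        => _processor.StopProcessingAsync(cancellationToken);
}
```

Registered with `services.AddHostedService<EventHubService>();` in Startup.cs, the listener runs for the lifetime of the app and is shut down gracefully with it.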
If you don't also have an API involved or are able to split the processes completely, then you should consider creating this as a console app instead of as a hosted service, which further separates the roles of API and event listener.
You didn't mention where you are deploying this, but if you happen to be deploying to an Azure App Service, you do have a few options for splitting the receiver from your API, and in that case I would definitely recommend doing so. Inside App Services there is a feature called WebJobs, which exists specifically to handle things like this without making them part of your API. A good alternative is Azure Functions; in that case you don't have to worry about setting up the DI for Event Hubs at all, as the host process takes care of that for you.
We have a custom implementation of IStringLocalizer that loads labels from our internal company CMS, which exposes data via an HTTP REST interface.
We wanted to use the .NET Core built-in localizer, but I do not like that the synchronous GetAllStrings method has to block on tasks to perform the HTTP call.
We obviously have a cache, but it still does not seem right.
Any thoughts on that?
Example:
public IEnumerable<LocalizedString> GetAllStrings(bool includeParentCultures)
{
    Task<CmsLabelsModel> task = pipeline.SendAsync(new GetLabelsModelRequest(ResourceName));
    CmsLabelsModel result = task.GetAwaiter().GetResult(); // Yuk Yuk
    return result.LabelModels.Select(pair => new LocalizedString(pair.Key, pair.Value.Value));
}
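One way to avoid the sync-over-async block entirely is to warm the cache asynchronously ahead of time (for example from a hosted service at startup), so GetAllStrings only ever reads labels that were already fetched. A rough sketch under that assumption (labelCache and WarmCacheAsync are illustrative names, not part of the original code):

```csharp
// Populated asynchronously at application startup (e.g. from an
// IHostedService), so no HTTP call happens inside GetAllStrings.
private readonly ConcurrentDictionary<string, string> labelCache = new();

public async Task WarmCacheAsync()
{
    CmsLabelsModel result = await pipeline.SendAsync(new GetLabelsModelRequest(ResourceName));
    foreach (var pair in result.LabelModels)
        labelCache[pair.Key] = pair.Value.Value;
}

public IEnumerable<LocalizedString> GetAllStrings(bool includeParentCultures)
{
    // Served entirely from the pre-warmed cache; no blocking on tasks.
    return labelCache.Select(pair => new LocalizedString(pair.Key, pair.Value));
}
```

The trade-off is that labels can be stale until the next refresh, which is usually acceptable for CMS-driven strings.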
Is AsyncForwardingAppender of the Log4Net.Async package safe to use in an ASP.NET MVC Web Application? I'm worried that it will clog up the available threads.
It comes down to me wanting to make a call to an async method to send the logs to an HTTP API. I could use an async void method like the way this guy did it:
protected override async void Append(log4net.Core.LoggingEvent loggingEvent)
{
    var message = new SplunkMessage(loggingEvent.Level, loggingEvent.ExceptionObject);
    var success = await _splunkLogger.Report(message);
    // do something with the success status, not really relevant
}
He later updated his code:
public void DoAppend(log4net.Core.LoggingEvent loggingEvent)
{
    var clientIp = _clientIpRetriever.GetClientIp();
    var message = new SplunkMessage(loggingEvent.Level, loggingEvent.ExceptionObject, clientIp);
    SendMessageToSplunk(message);
}

private async void SendMessageToSplunk(SplunkMessage message)
{
    try
    {
        var success = await _splunkLogger.Report(message);
        // do something unimportant
    }
    catch (Exception x)
    {
        LogLog.Error(GetType(), "Error in SplunkAppender.", x);
    }
}
But I'm too scared to actually try it because of the dangers involved: "First off, let me point out that "fire and forget" is almost always a mistake in ASP.NET applications".
Any suggestions?
Looking at the source, you can see that the AsyncForwardingAppender uses only one thread to dequeue the events, so using it won't starve your MVC app of threads.
Regarding "fire and forget" being a bad pattern in web apps, take that statement with a grain of salt: the linked answer talks about the danger of letting a functional operation go unsupervised, not a logging one. Logging should be able to fail without your application ceasing to work (which is why log4net never throws when configuration or logging fails).
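The single-dequeue-thread pattern that appender relies on can be sketched with a BlockingCollection: appends from request threads are cheap enqueues, and one dedicated background thread drains the queue. This is a simplified illustration of the idea, not the library's actual code:

```csharp
public class SingleThreadedAsyncAppender
{
    private readonly BlockingCollection<LoggingEvent> _queue = new();

    public SingleThreadedAsyncAppender()
    {
        // One background thread drains the queue, so the web app's
        // request threads only ever pay for an enqueue.
        var worker = new Thread(() =>
        {
            foreach (var loggingEvent in _queue.GetConsumingEnumerable())
            {
                try { Forward(loggingEvent); }
                catch { /* logging must never take the app down */ }
            }
        }) { IsBackground = true };
        worker.Start();
    }

    public void Append(LoggingEvent loggingEvent) => _queue.Add(loggingEvent);

    private void Forward(LoggingEvent loggingEvent)
    {
        // Send to Splunk / the HTTP API here.
    }
}
```

Because the thread is marked IsBackground, it cannot keep the process alive, and a slow logging endpoint only grows the queue rather than blocking requests.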
Good day!
I have a simple WCF service and a desktop client which uses some of its functions.
But my client faults with an exception because my WCF operation takes too long.
Some code:
client = new MyWcfServiceClient();
client.DoWork(param1, param2 /* etc. */);
I don't need to wait while DoWork does its work. I want to call the method and move on; it's like sending a command to do some work, and I don't need the result immediately.
How to do that?
Thank you!
P.S. The server-side code looks like:
DoWork(param1, etc)
{
    // do long-term work on the same thread
}
P.P.S. I ignore the result.
On the service, move your logic from DoWork(param1, etc) to another method, e.g. DoWorkBackground(...):
static Task _backgroundTask;

void DoWork(param1, etc)
{
    // Kick the work off on a thread-pool thread and return immediately.
    // Note: this allows only one background run at a time.
    if (_backgroundTask == null || _backgroundTask.IsCompleted)
        _backgroundTask = System.Threading.Tasks.Task.Run(
            () => DoWorkBackground(param1, etc));
}

void DoWorkBackground(param1, etc)
{
    // long-running logic
}
You could do something along the lines of:
private static Task DoWork()
{
    return Task.Run(() =>
    {
        // do stuff here
        // return something (?) if you need or care about the result
    });
}
and then:
Task task = DoWork();
// you can await, poll, or, if you don't care, ignore the result
Using .NET 4.5, you can generate task-based proxies while adding a service reference to an application. This can be done by clicking the Advanced button in the Add Service Reference dialogue. See here for more info.
Once this is done you will get a task-based method: DoWorkAsync.
Using this new proxy you can do things like this:
var result = await client.DoWorkAsync(param1,param2..etc);
If you can modify your DoWork operation, define it as one-way, so the call returns immediately. If DoWork is exposed as a web service, an HTTP 202 (Accepted) code will be returned.
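Marking the operation one-way is just an attribute on the service contract. A sketch (the contract and parameter names here are illustrative):

```csharp
[ServiceContract]
public interface IMyWcfService
{
    // IsOneWay = true: the client returns as soon as the message is
    // handed off; a one-way operation must return void and cannot
    // declare fault contracts.
    [OperationContract(IsOneWay = true)]
    void DoWork(string param1, int param2);
}
```

The caveat is that the client gets no confirmation the work succeeded, which matches the "I ignore the result" requirement here.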
I'm using the TPL to send emails to end users without delaying the API response, but I'm not sure which method should be used, since I'm dealing with the DB context here. I went with method 2 because I wasn't sure the DB context would still be available by the time the task runs, so I fetched a fresh EF object; or maybe I'm doing it all wrong.
public class OrdersController : ApiController
{
    private AllegroDMContainer db = new AllegroDMContainer();

    public HttpResponseMessage PostOrder(Order order)
    {
        // Creating a new EF object and adding it to the database
        Models.Order _order = new Models.Order { Name = order.Name };
        db.Orders.Add(_order);

        /* Method 1 */
        Task.Factory.StartNew(() =>
        {
            _order.SendEmail();
        });

        /* Method 2 */
        Task.Factory.StartNew(() =>
        {
            Models.Order rOrder = db.Orders.Find(_order.ID);
            rOrder.SendEmail();
        });

        return Request.CreateResponse(HttpStatusCode.Created);
    }
}
Both methods are wrong, because you're starting a fire-and-forget operation on a pool thread inside the ASP.NET process.
The problem is that an ASP.NET host is not guaranteed to stay alive between handling HTTP requests. E.g., it can be automatically recycled, manually restarted, or taken out of the farm, in which case the send-mail operation would never complete and you wouldn't get notified about it.
If you need to speed up the response delivery, consider outsourcing the send-mail operation to a separate WCF or Web API service. A related question: Fire and forget async method in asp.net mvc.