I have a client program which sends a request to a web API and gets a WebSocket back.
Once hooked up, I then send a second request to the API to start a number of tests.
The web API then hooks into the WebSocket as well, sending messages as it completes the tasks. I do this using System.Net.WebSockets.ClientWebSocket:
var cts = new CancellationTokenSource();
var socket = new ClientWebSocket();
string wsUri = "wss://localhost:44301/api/v1/devices/registerForSocket";
await socket.ConnectAsync(new Uri(wsUri), cts.Token);
RunTests(deviceid, socket, cts);
Now I need to add the ability for the client to also poll for a current progress update. Basically, this will be a bool which is set to false while a process is being run and then reverted back to true once it's finished.
To do this, I will have to run a separate ClientWebSocket for receiving the "PROGRESS" messages.
However, how can I share the progress between the two tasks? Is this the best way to go about it?
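One option (a minimal sketch, not from the original post) is to share a single thread-safe flag between the task receiving the "PROGRESS" messages and the task running the tests, rather than coordinating through the sockets themselves:

```csharp
using System.Threading;

// Hypothetical shared-state object; both tasks hold a reference to the
// same instance, so no second channel is needed to exchange the flag.
public class ProgressState
{
    private int _busy; // 0 = idle, 1 = a process is running

    public bool IsIdle => Volatile.Read(ref _busy) == 0;

    public void MarkRunning() => Interlocked.Exchange(ref _busy, 1);
    public void MarkFinished() => Interlocked.Exchange(ref _busy, 0);
}
```

The receive loop calls MarkRunning/MarkFinished as PROGRESS messages arrive, and any other task can read IsIdle at any time; Interlocked and Volatile keep the reads and writes safe across tasks without locks.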
Related
I have an ASP.NET Core 5 Web API:
I have this:
await _mailSender.SendMailAsync(entity.Id);
But I don't want the client to wait for the email sender to finish before the API returns Ok.
I want to continue code execution and send the email in the background.
How can I do this?
I did this using Hangfire to queue the email to send.
//send email
BackgroundJob.Enqueue(() => _mailSender.SendMailAsync(entity.Id));
By using the Task class, you can execute a single operation asynchronously.
Try this code:
System.Threading.Tasks.Task.Run(() => _mailSender.SendMailAsync(entity.Id));
The work performed by a Task object typically executes asynchronously on a thread pool thread rather than synchronously on the main application thread.
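One caveat worth adding (a hedged sketch, assuming some logger such as an injected `_logger` is available; that field is not in the original code): a bare Task.Run swallows any exception the send throws, so at minimum observe it:

```csharp
// Fire-and-forget, but observe failures so a failed send doesn't vanish silently.
// The discard (_) makes the intent to not await explicit.
_ = Task.Run(async () =>
{
    try
    {
        await _mailSender.SendMailAsync(entity.Id);
    }
    catch (Exception ex)
    {
        _logger.LogError(ex, "Background mail send failed for {Id}", entity.Id);
    }
});
```

Note that work queued this way is lost if the application recycles before it runs; for anything that must not be dropped, a durable queue (such as the Hangfire approach above) is safer.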
We have a function app that builds a large JSON payload (±2000 lines) every day and posts it to the API to be mapped and saved into a database.
We are using CQRS with MediatR, and it seems the API side takes exceptionally long to create and save all the necessary information.
The problem we have is that the function's PostAsJsonAsync waits for the API response and times out after a few minutes.
Any idea how to run this as a background task, or just post and forget? Our API is only concerned that it received the data.
Function side:
using (var client = new HttpClient())
{
    client.Timeout = new TimeSpan(0, 10, 0);
    // Times out waiting for the API
    var response = await client.PostAsJsonAsync($"{endpoint}/api/v1.0/BatchImport/Import", json);
    response.EnsureSuccessStatusCode();
}
API mediatr handle side:
public async Task<Unit> Handle(CreateBatchOrderCommand request, CancellationToken cancellationToken)
{
    // Takes long to process all the data
    foreach (var importOrder in request.Payload)
    {
        await PopulateImportDataAsync(importOrder, cancellationToken);
        await CreateOrderAsync(importOrder, cancellationToken);
    }

    return Unit.Value;
}
Cheers
The problem we have is that the function's PostAsJsonAsync waits for the API response and times out after a few minutes.
The easiest solution is going to be just increasing that timeout. If you are talking about Azure Functions, I believe you can increase the timeout to 10 minutes.
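For Azure Functions, the execution timeout is controlled by `functionTimeout` in host.json; on the Consumption plan the default is 5 minutes and the maximum is 10, for example:

```json
{
  "version": "2.0",
  "functionTimeout": "00:10:00"
}
```

On Premium and Dedicated plans longer (or unbounded) timeouts are available, but leaning on ever-longer timeouts is usually a sign the work belongs in a background process.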
Any idea how to run this as a background task, or just post and forget? Our API is only concerned that it received the data.
Any fire-and-forget solution is not going to end well; you'll end up with lost data. I recommend that you not use fire-and-forget at all, and this advice goes double as soon as you're in the cloud.
Assuming increasing the timeout isn't sufficient, your solution is to use a basic distributed architecture, as described on my blog:
Have your API place the incoming request into a durable queue.
Have a separate backend (e.g., Azure (Durable) Function) process that request from the queue.
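A hypothetical sketch of that split, using Azure Storage Queues (the queue name "batch-orders", the BatchPayload type, and the field names are placeholders, not from the original code):

```csharp
// API side: accept the payload, enqueue it, and return 202 immediately.
// "We received it" is now decoupled from "we processed it".
[HttpPost("api/v1.0/BatchImport/Import")]
public async Task<IActionResult> Import([FromBody] BatchPayload payload)
{
    var queue = new QueueClient(_connectionString, "batch-orders");
    await queue.SendMessageAsync(JsonSerializer.Serialize(payload));
    return Accepted();
}

// Backend side: an Azure Function picks the message up and does the slow work,
// with no HTTP caller waiting on it.
[Function("ProcessBatchOrder")]
public async Task Run([QueueTrigger("batch-orders")] string message)
{
    var payload = JsonSerializer.Deserialize<BatchPayload>(message);
    await _mediator.Send(new CreateBatchOrderCommand { Payload = payload.Orders });
}
```

One practical note: Storage Queue messages are capped at 64 KB, so for a payload this size you would typically write the body to blob storage and enqueue only a reference to it.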
Assuming you’re on .NET Core, you could stick incoming requests into a queued background task:
https://learn.microsoft.com/en-us/aspnet/core/fundamentals/host/hosted-services?view=aspnetcore-6.0&tabs=visual-studio#queued-background-tasks
Keep in mind this chews up resources from servicing other web requests, so it will not scale well to millions of requests. This same basic principle, a queued message plus offline processing, can also be distributed across multiple services to take some of the load off the web service.
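A condensed sketch of the pattern from those docs, built on System.Threading.Channels (the names are illustrative):

```csharp
public interface IBackgroundTaskQueue
{
    ValueTask QueueAsync(Func<CancellationToken, ValueTask> workItem);
    ValueTask<Func<CancellationToken, ValueTask>> DequeueAsync(CancellationToken ct);
}

public class BackgroundTaskQueue : IBackgroundTaskQueue
{
    // Bounded so a flood of requests applies back-pressure instead of
    // exhausting memory.
    private readonly Channel<Func<CancellationToken, ValueTask>> _queue =
        Channel.CreateBounded<Func<CancellationToken, ValueTask>>(100);

    public ValueTask QueueAsync(Func<CancellationToken, ValueTask> workItem) =>
        _queue.Writer.WriteAsync(workItem);

    public ValueTask<Func<CancellationToken, ValueTask>> DequeueAsync(CancellationToken ct) =>
        _queue.Reader.ReadAsync(ct);
}

// A hosted service drains the queue in the background, off the request path.
public class QueuedHostedService : BackgroundService
{
    private readonly IBackgroundTaskQueue _queue;
    public QueuedHostedService(IBackgroundTaskQueue queue) => _queue = queue;

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            var workItem = await _queue.DequeueAsync(stoppingToken);
            await workItem(stoppingToken);
        }
    }
}
```

The controller enqueues a work item and returns immediately; register both types with DI (`AddSingleton<IBackgroundTaskQueue, BackgroundTaskQueue>()` and `AddHostedService<QueuedHostedService>()`). Unlike a durable queue, anything still in this in-memory channel is lost if the process restarts.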
I send a message using PromptCustomDialog. If a person doesn't answer the question within some period of time, how can the next message be sent? I would be grateful for examples.
await context.Forward(new PromptCustomDialog(message, answers), Complete, context.MakeMessage(), CancellationToken.None);
public async Task Complete(IDialogContext context, IAwaitable<string> result)
{
    var res = await result;
    string response = res;
    await Choose(context, response);
}
This would require you to set some kind of timer that would trigger an event that would cause the bot to send out a proactive message to the user. You can read more about sending proactive messages here.
The only thing I would point out is that bots, like web services, are often running multiple instances across multiple servers (e.g. if you're deployed on Azure App Services), so you would need to use some kind of distributed, stateful timer service to help you with this to ensure that the timer fires and triggers the event no matter what server it originated from.
I'm using C# with .NET Framework 4.5. I'm writing a server application that should be able to connect to arbitrarily many clients at once. So I have one thread that will listen for connections, and then it will send the connection to a background thread to go into a loop waiting for messages. Since the number of supportable client connections should be very high, spawning a new thread for every connection won't work. Instead what I need is a thread pool. However, I don't want to use the system thread pool because these threads will essentially be blocked in a call to Socket.Select indefinitely, or at least for the life of the connections they host.
So I think I need a custom ThreadPool that I can explicitly round-robin the connections over to. How to achieve this in C#?
There's no point in using threads for this - that's just wasting resources.
Instead, you want to use asynchronous I/O. Now, ignoring all the complexities involved with networking, the basic idea is something like this:
async Task Loop()
{
    using (var client = new TcpClient())
    {
        await client.ConnectAsync(IPAddress.Loopback, 6536).ConfigureAwait(false);
        var stream = client.GetStream();
        byte[] outBuf = new byte[4096];

        // The "message loop"
        while (true)
        {
            // Asynchronously wait for the message
            var read = await stream.ReadAsync(outBuf, 0, outBuf.Length);

            // Handle the message, asynchronously send replies, whatever...
        }
    }
}
You can run this method for each of the connections you're making (without using await - very important). The thread handling that particular socket will be released on every await; the continuation of that await will be posted on a thread-pool thread.
The result is that the number of threads in use at any given time will tend to self-balance with the available CPU cores etc., while you can easily service thousands of connections at a time.
Do not use the exact code I posted. You need to add tons of error-handling code etc. Handling networking safely and reliably is very tricky.
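For completeness, a hedged sketch of the accept side, which starts one such loop per client without awaiting it (HandleClientAsync stands in for a Loop-style method that takes the already-accepted client; it is not part of the code above):

```csharp
var listener = new TcpListener(IPAddress.Any, 6536);
listener.Start();

while (true)
{
    // Asynchronously wait for the next incoming connection
    TcpClient client = await listener.AcceptTcpClientAsync();

    // Fire off the per-connection message loop; deliberately NOT awaited,
    // so the listener immediately goes back to accepting.
    _ = HandleClientAsync(client);
}
```

The discard (`_ =`) signals the intentional fire-and-forget; HandleClientAsync should catch its own exceptions, since nothing awaits it.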
We're planning our system to have a set of publicly accessible services which call into a set of internal services, all implemented using ServiceStack.
My question is, what is the best method (in terms of performance, stability and code maintainability) for this cross-service communication?
E.g. should my public services call the internal services using a ServiceStack client or use the Rabbit / Redis messaging system? And if the latter, can I call two or more internal services asynchronously and await the responses from both?
For one-way communications, messaging offers a lot of benefits; if installing a Rabbit MQ broker is an option, Rabbit MQ provides the more industrial-strength option.
For request/reply services where requests are transient and both endpoints are required to be up, the typed C# Service Clients allow for more direct/debuggable point-to-point communications with less moving parts.
Using the clients' async APIs lets you easily make multiple calls in parallel, e.g.:
//fire off to 2 async requests simultaneously...
var task1 = client.GetAsync(new Request1 { ... });
var task2 = client.GetAsync(new Request2 { ... });
//additional processing if any...
//Continue when first response is received
var response1 = await task1;
//Continue after 2nd response; if it arrived before task1, this returns instantly
var response2 = await task2;
The above code continues after Request1 is completed; you can also use Task.WhenAny() to process whichever request completed first.
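A sketch of that Task.WhenAny() variant (the Request1/Request2 construction is elided in the original, shown here with default constructors for illustration):

```csharp
var task1 = client.GetAsync(new Request1());
var task2 = client.GetAsync(new Request2());

// Completes as soon as either response arrives
var first = await Task.WhenAny(task1, task2);
if (first == task1)
{
    var response1 = await task1; // already finished; this just unwraps the result
    // handle Request1's response first...
}

// Wait for whichever call is still outstanding before moving on
await Task.WhenAll(task1, task2);
```

Task.WhenAny returns the task that finished first (whether it succeeded or faulted); awaiting that task then surfaces its result or exception.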