I am currently working on a Web API using .NET Core. One of my API methods calls a number of third-party APIs, which take some time to return a response, but I don't want our API consumers to wait for that. Instead I want to return an early response, i.e. "The operation has started", and provide an endpoint through which consumers can check the status of that operation. For example, a consumer calls the API to generate 100k records, for which my API makes around 20 parallel calls to the third-party API. I don't want the consumer to wait for those 20 responses.
Currently I have this code:
public async Task<ActionResult> GenerateVouchers([FromBody][Required] CreateVoucherRequestModel request, string clientId)
{
    _logger.LogInformation(Request.Method, Request.Path);
    // await _voucherService.ValidateIdempotedKeyWithStatus(clientId, _idempotentHeader);
    // TODO: Check voucher type & status before generating vouchers
    var watch = Stopwatch.StartNew();
    var vouchers = new List<VoucherCreateResponseModel>();
    var batchSize = 5000;
    int numberOfBatches = (int)Math.Ceiling((double)request.quantity / batchSize);
    int totalVoucherQuantity = request.quantity;
    request.quantity = batchSize;

    var tasks = new List<Task<VoucherCreateResponseModel>>();
    for (int i = 0; i < numberOfBatches; i++)
    {
        tasks.Add(_client.GenerateVoucher($"CouponsCreate", request));
        vouchers.AddRange(await Task.WhenAll(tasks).ConfigureAwait(false));
    }

    // await _voucherService.GenerateVouchers(request, clientId, _idempotentHeader);
    watch.Stop();
    var totalMS = watch.ElapsedMilliseconds;
    return Ok();
}
But the issue with the above code is that, even though I have ConfigureAwait(false), it waits for all 20 requests to complete, and only then does the API consumer get a response. Each of these 20 requests takes around 5 seconds to execute, so our consumers may hit a request timeout while waiting for the response.
How can I fix this in .NET Core?
It's not good practice to wait for a long-running process inside a controller. My opinion is:
Put the data necessary for the long-running process (something like a batch Id) onto an Azure queue from within the API.
Trigger a Function App from that queue. The API's only responsibility is putting the data into the queue; from there on it's the Function App's responsibility to complete the process.
Maybe using something like SignalR you can notify the frontend when the process is completed.
A rough sketch of the API side is shown below.
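This is a minimal sketch assuming the Azure.Storage.Queues package; the queue name, status URL, operation id, and message shape are illustrative assumptions, not part of the original code:

using System.Text.Json;
using Azure.Storage.Queues;

public class VouchersController : ControllerBase
{
    private readonly QueueClient _queue; // injected; points at e.g. a "voucher-batches" storage queue

    public VouchersController(QueueClient queue) => _queue = queue;

    [HttpPost("vouchers")]
    public async Task<ActionResult> GenerateVouchers([FromBody] CreateVoucherRequestModel request, string clientId)
    {
        var operationId = Guid.NewGuid().ToString();

        // Only enqueue the work; a Function App bound to the queue makes the third-party calls.
        var message = JsonSerializer.Serialize(new { operationId, clientId, request.quantity });
        await _queue.SendMessageAsync(message);

        // 202 Accepted plus a status URL the consumer can poll.
        return Accepted($"/vouchers/operations/{operationId}", new { operationId, status = "Started" });
    }
}

The status endpoint would then read the operation's state (for example from storage updated by the Function App) and return it to the consumer.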
I have a web app that connects to an external API.
That API has a limit of 3 connections per second.
I have a method that gets employee data for a whole factory.
It works fine, but I've found that if a particular factory has a lot of employees, I hit the API connection limit and get an error.
(429) API calls exceeded...maximum 3 per Second
So I decided to use await Task.Delay(1000) to add a 1-second delay every time this method is used.
Now it seems to have reduced the number of errors I get, but I am still getting a few limit errors.
Is there another method I could use to ensure my limit is not reached?
Here is my code:
public async Task<YourSessionResponder> GetAll(Guid factoryId)
{
    UserSession.AuthData sessionManager = new UserSession.AuthData
    {
        UserName = "xxxx",
        Password = "xxxx"
    };

    ISessionHandler sessionMgr = new APIclient();

    YourSessionResponder response;
    response = await sessionMgr.GetDataAsync(sessionManager, new ListerRequest
    {
        FactoryId = factoryId
    });

    await Task.Delay(1000);
    return response;
}
I call it like this:
var yourEmployees = GetAll(factoryId);
Your current code limits the number of outgoing requests made by a single incoming request to your API. What you need to do is limit all of your outgoing requests, app-wide.
It's possible to do this using a SemaphoreSlim:
private static readonly SemaphoreSlim Mutex = new(1);

public async Task<YourSessionResponder> GetAll(Guid factoryId)
{
    ...

    YourSessionResponder response;

    await Mutex.WaitAsync();
    try
    {
        response = await sessionMgr.GetDataAsync(...);
        await Task.Delay(1000);
    }
    finally
    {
        Mutex.Release();
    }

    return response;
}
But I would take a different approach...
Is there another method I could use to ensure my limit is not reached?
Generally, I recommend just retrying on 429 errors, using de-correlated jittered exponential backoff (see Polly for an easy implementation). That way, when you're "under budget" for the time period, your requests go through immediately, and they only slow down when you hit your API limit.
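As a rough sketch of that approach, assuming the Polly NuGet package (using Polly;); the retry count, base delay, and jitter formula below are illustrative, not a prescribed configuration:

private static readonly Random Jitter = new Random();

private static readonly IAsyncPolicy<HttpResponseMessage> RetryOn429 =
    Policy
        .HandleResult<HttpResponseMessage>(r => (int)r.StatusCode == 429)
        .WaitAndRetryAsync(
            retryCount: 5,
            // exponential backoff plus random jitter, so parallel callers don't retry in lockstep
            sleepDurationProvider: attempt =>
                TimeSpan.FromMilliseconds(Math.Pow(2, attempt) * 200 + Jitter.Next(0, 250)));

// usage: wrap the outgoing HTTP call in the policy
// var response = await RetryOn429.ExecuteAsync(() => httpClient.GetAsync(requestUri));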
From a comment on the question:
I am calling it like this: var yourEmployees = GetAll(factoryId);
Then you're not awaiting the task. While there's a 1-second delay after each network operation, you're still firing off all of the network operations in rapid succession. You need to await the task before moving on to the next one:
var yourEmployees = await GetAll(factoryId);
Assuming that this is happening in some kind of loop or repeated operation, of course. Otherwise, where would all of these different network tasks be coming from? Whatever high-level logic is invoking the multiple network operations, that logic needs to await one before moving on to the next.
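For example, if the calls are made in a loop over factories (the factoryIds collection here is hypothetical), awaiting inside the loop keeps the calls sequential:

foreach (var factoryId in factoryIds)
{
    // each GetAll call (including its internal 1-second delay) finishes before the next starts
    var yourEmployees = await GetAll(factoryId);
    // ... use yourEmployees ...
}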
I am using ManualResetEventSlim as a signaling mechanism in my application, and it works well at 100 requests/sec. As I increase the requests/sec, it gets worse.
Example:
100 requests/sec -> 90% of transactions done in 250 ms, and throughput (successful requests/sec) is 134.
150 requests/sec -> 90% of transactions done in 34067 ms, and throughput (successful requests/sec) is 2.2.
I use a ConcurrentDictionary as given below:
// <key, (responseString,ManualResetEventSlim) >
private static ConcurrentDictionary<string, (string, ManualResetEventSlim)> EventsDict = new ConcurrentDictionary<string, (string, ManualResetEventSlim)>();
The process below describes the need for ManualResetEventSlim (API Solution 1 and API Solution 2 are completely separate applications):
API Solution 1 (REST API) receives a request, adds an element (null, ManualResetEventSlim) to the ConcurrentDictionary against a key, and calls the third-party service (SOAP) using async/await. The third-party SOAP API returns an acknowledgement, but the actual response is still pending. After getting the acknowledgement, Solution 1 waits on the ManualResetEventSlim.
Once the third party has processed the request, it calls API Solution 2 (SOAP) through an exposed method and sends the actual response. API Solution 2 forwards the response to API Solution 1 (REST API) by making an HTTP request, and then inserts the data into the database for the audit log.
API Solution 1 gets the key from the response string, updates the response string in the ConcurrentDictionary, and sets the signal.
API Solution 1 disposes the ManualResetEventSlim object before returning the response to the client.
I think you should be able to get rid of the blocking code by replacing (string, ManualResetEventSlim) with TaskCompletionSource<string>.
In Solution 1, you would do something along these lines:
TaskCompletionSource<string> tcs = new TaskCompletionSource<string>();
EventsDict.TryAdd(key, tcs);
await KickOffSolution2ThirdParty( /*...*/ );
string result = await tcs.Task; // <-- now not blocking any thread anymore
And the counterpart:
void CallbackFromSolution2(string key, string result)
{
    if (EventsDict.TryRemove(key, out TaskCompletionSource<string> tcs))
    {
        tcs.SetResult(result);
    }
}
This is of course only a coarse outline of the idea. But hopefully enough to make my line of thought understandable. I cannot test this right now, so any improvements/corrections welcome.
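For completeness, the dictionary from the question would then hold the TaskCompletionSource directly (my assumption of how the declaration changes; it is not shown in the original outline):

// <key, TaskCompletionSource that Solution 1 awaits and the Solution 2 callback completes>
private static ConcurrentDictionary<string, TaskCompletionSource<string>> EventsDict =
    new ConcurrentDictionary<string, TaskCompletionSource<string>>();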
I am a newbie to .NET Core and asynchronous programming. I am trying to implement a console application that does the following:
The console application should work as an intermediary between two external APIs, e.g. API-1 and API-2.
It should call API-1 every 10 milliseconds to get data.
It should immediately call API-2 to submit the data received from API-1.
The console application needs to wait for API-1's data, but does not have to wait for the response from API-2.
Below is my code. It is not working as expected. At first it invokes API-1 after 10 milliseconds as expected, but after that it invokes API-1 only after it receives the response from API-2.
So if API-2 takes 20 seconds, API-1 is also invoked only every 20 seconds.
How do I make the API-2 call asynchronous so that the program does not have to wait for the API-2 response?
namespace ConsoleApp1
{
    public class Program
    {
        private static Timer _timer;
        private const int TIME_INTERVAL_IN_MILLISECONDS = 10; // 10 milliseconds
        private const int API2_DELAY = 20000; // 20 seconds

        public static void Main(string[] args)
        {
            Dowork().Wait();
            Console.WriteLine("Press Any Key to stop");
            Console.ReadKey();
            Console.WriteLine("Done");
        }

        private static async Task Dowork()
        {
            var data = new SomeData();
            _timer = new Timer(CallAPI1, data, TIME_INTERVAL_IN_MILLISECONDS, Timeout.Infinite);
            await Task.Yield();
        }

        private static async void CallAPI1(object state)
        {
            var data = state as SomeData;
            Console.WriteLine("Calling API One to get some data.");
            data.SomeStringValue = DateTime.Now.ToString();
            await CallAPI2(data);
            _timer.Change(TIME_INTERVAL_IN_MILLISECONDS, Timeout.Infinite);
        }

        private static async Task CallAPI2(SomeData data)
        {
            Console.WriteLine("Calling API Two by passing some data received from API One " + data.SomeStringValue);
            // the delay represents a long-running call to API 2
            await Task.Delay(API2_DELAY);
        }
    }
}
POCO class
namespace ConsoleApp1
{
    public class SomeData
    {
        public string SomeStringValue { get; set; }
    }
}
Also note that API-1 and API-2 will be developed in ASP.NET Core 1
Update1
Let me rephrase the above. API-1 will be developed in .NET Core, but API-2 will be a Windows Workflow service. That means we can make multiple calls to WF; WF will persist the requests and process them one at a time.
Update2
After going through all the answers and links provided, I am thinking of using a Windows service as the intermediator instead of a console application. Right now .NET Core does not support Windows services, but there is a NuGet package that can host .NET Core inside a Windows service, or I might use a classic Windows service on .NET Framework 4.6.2. I guess I can do the asynchronous implementation inside a Windows service as well.
There are a lot of things I would do differently in this situation. Rather than using a timer I would use Task.Delay, and I would most certainly wait for API-2 to complete before attempting to throw more data at it. Additionally, I would ensure that my async methods return Task or Task<T>; notice that your CallAPI1 isn't. I understand it's a timer callback, but that is another issue.
Consider the following:
async Task IntermediateAsync()
{
    Console.WriteLine("Press ESC to exit...");

    while (Console.ReadKey(true).Key != ConsoleKey.Escape)
    {
        var result = await _apiServiceOne.GetAsync();
        await _apiServiceTwo.PostAsync(result);

        // Wait ten milliseconds after each successful mediation phase
        await Task.Delay(10);
    }
}
This will act in the following manner:
Print a line instructing the user how to exit
Start loop
Get the result of API1
Pass the result to API2
Wait 10 milliseconds
[Step 2]
Finally, this is the same suggestion regardless of whether or not you're using .NET Core. Any API interactions should follow the same guidelines.
Notes
Using fire-and-forget on the second API call is simply setting your code up for failure. Since it is an API call, there is more than likely going to be some latency in the I/O-bound operations, and one should assume that a tight 10-millisecond loop is only going to flood the availability of that endpoint. Why not simply wait for it to finish? What reason could you possibly have?
Remove the await when calling API2
private static async void CallAPI1(object state)
{
    var data = state as SomeData;
    Console.WriteLine("Calling API One to get some data.");
    data.SomeStringValue = DateTime.Now.ToString();

    // Before: this caused the program to wait for API-2 to finish
    // await CallAPI2(data);

    // Now it will call and forget (the unawaited task triggers compiler warning CS4014)
    CallAPI2(data);

    _timer.Change(TIME_INTERVAL_IN_MILLISECONDS, Timeout.Infinite);
}
Edit:
As David points out, there are of course many ways to solve this problem; this is not a correct approach to solving yours.
Another way of doing things is to use Quartz.NET:
Schedule API1 as a repeating job
When API1 is done, schedule another job to run API2 as a standalone job
This way, when API-2 fails you can replay/repeat the job.
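A rough Quartz.NET sketch of that idea; the job classes, the Api1Client helper, and the data passing via JobDataMap are assumptions for illustration only:

using Quartz;
using Quartz.Impl;

// Repeating job that calls API-1, then hands the result to a one-shot API-2 job.
public class CallApi1Job : IJob
{
    public async Task Execute(IJobExecutionContext context)
    {
        string data = await Api1Client.GetAsync(); // hypothetical API-1 client

        // CallApi2Job (not shown) reads the payload from the JobDataMap and posts it to API-2
        var api2Job = JobBuilder.Create<CallApi2Job>()
            .UsingJobData("payload", data)
            .Build();
        var api2Trigger = TriggerBuilder.Create().StartNow().Build();

        // standalone job: if it fails, it can be replayed/repeated on its own
        await context.Scheduler.ScheduleJob(api2Job, api2Trigger);
    }
}

public static class SchedulerSetup
{
    public static async Task StartAsync()
    {
        var scheduler = await new StdSchedulerFactory().GetScheduler();
        await scheduler.Start();

        // schedule API-1 as the repeating job
        await scheduler.ScheduleJob(
            JobBuilder.Create<CallApi1Job>().Build(),
            TriggerBuilder.Create()
                .StartNow()
                .WithSimpleSchedule(s => s.WithInterval(TimeSpan.FromMilliseconds(10)).RepeatForever())
                .Build());
    }
}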
I have a number of producer tasks that push data into a BlockingCollection; let's call it requestQueue.
I also have a consumer task that pops requests from the requestQueue, and forwards async http requests to a remote web service.
I need to throttle or block the number of active requests sent to the web service. On some machines that are far away from the service or have a slower internet connection, the http response time is long enough that the number of active requests fills up more memory than I'd like.
At the moment I am using a semaphore approach, calling WaitOne on the consumer thread multiple times, and Release on the HTTP response callback. Is there a more elegant solution?
I am bound to .NET 4.0 and would like a solution based on the standard library.
You are already using a BlockingCollection, so why have a WaitHandle?
The way I would do it is to have a BlockingCollection with n as its bounded capacity, where n is the maximum number of concurrent requests you want to have in flight at any given time.
You can then do something like....
var n = 4;
var blockingQueue = new BlockingCollection<Request>(n);

Action<Request> consumer = request =>
{
    // do something with request.
};

var noOfWorkers = 4;
var workers = new Task[noOfWorkers];

for (int i = 0; i < noOfWorkers; i++)
{
    var task = new Task(() =>
    {
        foreach (var item in blockingQueue.GetConsumingEnumerable())
        {
            consumer(item);
        }
    }, TaskCreationOptions.LongRunning | TaskCreationOptions.DenyChildAttach);

    workers[i] = task;
    workers[i].Start();
}

Task.WaitAll(workers);
I'll let you take care of cancellation and error handling, but using this you can also control how many workers you want to have at any given time. If the workers are busy sending and processing requests, any producer will be blocked until more room is available in the queue.
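On the producer side, Add is what provides the throttling: it blocks once the collection already holds n un-consumed items (a brief sketch; request here stands for whatever the producers build):

// producers: blocks while the queue already contains n items
blockingQueue.Add(request);

// once all producers are done, let GetConsumingEnumerable() complete so the workers exit
blockingQueue.CompleteAdding();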
Background: We have import-functions that can take anywhere from a few seconds to 1-2 hours to run depending on the file being imported. We want to expose a new way of triggering imports, via a REST request.
Ideally the REST service would be called, trigger the import, and reply with a result when done. My question is: since it can take up to two hours to run, is it possible to reply, or will the request time out for the caller? Is there a better way to handle this kind of operation?
What I use in these cases is an asynchronous operation that returns no result (a void action result in the case of C# Web API), and then sends the result asynchronously using a message queue.
E.g.
[HttpPut]
[Route("update")]
public void Update()
{
    var task = Task.Run(() => this.engine.Update());
    task.ContinueWith(t => publish(t, "Update()"));
}