For the past day I have been looking for a way to make my code fully asynchronous, so that when it is called by a REST API I get an immediate response while the process keeps running in the background.
To do that I simply used
tasks.Add(Task<bool>.Run(() => WholeProcessFunc(parameter)));
where WholeProcessFunc is the function that makes all the calculations (it may be computationally intensive).
It works as expected; however, I read that it is not optimal to wrap the whole process in a Task.Run.
My code needs to compute several Entity Framework queries, each of whose results depends on the previous one, and it also contains foreach loops.
For instance, I can't understand what the best practice is for making a function like this async:
public async Task<List<float>> func()
{
    List<float> acsi = new List<float>();
    using (var db = new EFContext())
    {
        long[] ids = await db.table1.Join(db.table2 /*,...*/)
            .Where(/*...*/)
            .Select(/*...*/)
            .ToArrayAsync();
        foreach (long id in ids)
        {
            var all = db.table1.Join(/*...*/)
                .Where(/*...*/);
            float acsi_temp = await all.OrderByDescending(/*...*/)
                .Select(/*...*/)
                .FirstAsync();
            if (acsi_temp < 0) { break; }
            acsi.Add(acsi_temp);
        }
    }
    return acsi;
}
In particular, I have difficulties with the foreach loop and the fact that the result of one query is used in the next one.
Finally, there is the break statement, which I don't know how to translate. I read about cancellation tokens; could they be the way?
Is wrapping this whole function in a Task.Run a solid solution?
For the past day I have been looking for a way to make my code fully asynchronous, so that when it is called by a REST API I get an immediate response while the process keeps running in the background.
Well, that's one meaning of the word "asynchronous". Unfortunately, it's completely different from the kind of "asynchronous" that async/await gives you: async yields to the thread pool, not to the client (browser).
It works as expected; however, I read that it is not optimal to wrap the whole process in a Task.Run.
It only seems to work as expected. It's likely that once your web site gets higher load, it will start to fail. It's definite that once your web site gets busier and you do things like rolling upgrades, it will start to fail.
Is wrapping this whole function in a Task.Run a solid solution?
Not at all. Fire-and-forget is inherently dangerous.
A proper solution should be a basic distributed architecture:
A durable queue, such as an Azure Queue or Rabbit (if properly configured to be durable).
An independent processor, such as an Azure Function or Win32 Service.
Then the ASP.NET app will encode the work to be done into a queue message, enqueue that to the durable queue, and then return. Some time later, the processor will retrieve the message from that queue and do the actual work.
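As a rough sketch of the enqueue-and-return side (using Azure.Storage.Queues and System.Text.Json; the queue name "calc-work", the injected _connectionString, and the CalcRequest DTO are placeholders, not part of the original code):
[HttpPost("calculate")]
public async Task<IActionResult> StartCalculation([FromBody] CalcRequest request)
{
    // "calc-work" and _connectionString are assumptions for the sake of the example.
    var queue = new QueueClient(_connectionString, "calc-work");
    await queue.CreateIfNotExistsAsync();

    // Encode the work to be done into a queue message and enqueue it.
    await queue.SendMessageAsync(JsonSerializer.Serialize(request));

    // Return immediately; the independent processor picks the message up later.
    return Accepted();
}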
You can translate your code to return an IAsyncEnumerable<...>; that way the caller can process the results as they are obtained. In an ASP.NET 5 MVC endpoint, this includes writing serialised JSON to the browser:
public async IAsyncEnumerable<float> func()
{
    using (var db = new EFContext())
    {
        //...
        foreach (long id in ids)
        {
            //...
            if (acsi_temp < 0) { yield break; }
            yield return acsi_temp;
        }
    }
}
public async Task<IActionResult> ControllerAction()
{
    if (...)
        return NotFound();
    return Ok(func());
}
Note that if your endpoint itself is an async IAsyncEnumerable coroutine, then in ASP.NET 5 your headers would be flushed before your action even started, giving you no way to return any HTTP error codes.
Though for performance, you should try to rework your queries so that you can fetch all the data up front.
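As a rough illustration of that reworking (the entity and column names Scores, GroupId and Value are invented here, since the real joins and projections are elided above), you can pull the rows for all ids in one round trip and then run the per-id logic in memory:
// Sketch only: one query for all ids instead of one query per id.
var rows = await db.Scores
    .Where(s => ids.Contains(s.GroupId))
    .ToListAsync();

var acsi = new List<float>();
foreach (long id in ids)
{
    float acsi_temp = rows
        .Where(r => r.GroupId == id)
        .OrderByDescending(r => r.Value)
        .Select(r => r.Value)
        .First();

    if (acsi_temp < 0) { break; }
    acsi.Add(acsi_temp);
}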
I'm working on an ASP.NET Core Web API project with .NET Core 5.0.0 and I'm using Azure.Storage.Queues 12.6.0 for writing and reading queue messages.
Everything works fine, but I was wondering whether the way I read messages is OK or whether there is a better approach in terms of speed and efficiency.
This is the code I'm using. It's just a piece of code from a Microsoft tutorial, put inside a while() loop. AzureQueue is just an instance of the QueueClient class.
public async Task StartReading()
{
    while (true)
    {
        if (await AzureQueue.ExistsAsync())
        {
            QueueProperties properties = await AzureQueue.GetPropertiesAsync();
            if (properties.ApproximateMessagesCount > 0)
            {
                QueueMessage[] retrievedMessage = await AzureQueue.ReceiveMessagesAsync(1);
                string theMessage = retrievedMessage[0].MessageText;
                await AzureQueue.DeleteMessageAsync(retrievedMessage[0].MessageId, retrievedMessage[0].PopReceipt);
            }
        }
    }
}
Honestly, I'm not comfortable using an infinite while() loop, because it appears to me as something I can't control or stop. But my personal feelings apart, is this kind of while() an acceptable approach or should I use something else? I just need to keep reading messages as soon as they arrive in the queue.
What you can do is place your code in an Azure Function.
Then bind the Azure Function to trigger when there is something on the queue.
See: https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-queue-trigger?tabs=csharp
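A minimal in-process queue-triggered function looks roughly like this (the queue name "myqueue-items" and the "AzureWebJobsStorage" connection setting are placeholders; it uses Microsoft.Azure.WebJobs and Microsoft.Extensions.Logging):
public static class QueueFunctions
{
    [FunctionName("ProcessQueueMessage")]
    public static void Run(
        [QueueTrigger("myqueue-items", Connection = "AzureWebJobsStorage")] string message,
        ILogger log)
    {
        // Invoked once per message as it arrives; no polling loop needed.
        log.LogInformation($"Processing queue message: {message}");
    }
}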
I am having some severe performance issues in a project I'm working on. It's a standard web application project - users send requests to an API which trigger some form of computation in various handlers.
The problem right now is that pretty much any request will drive the CPU usage of the server up significantly, regardless of what internal computation the corresponding function is supposed to do. For example, we have an endpoint to display a game from the database - the user sends a request containing an ID and the server responds with a JSON object. When this request is being processed, the CPU usage goes from 5% (with the app just running) to 25-30%. Several concurrent requests will tank the server, with .NET Core using 60-70% of the CPU.
The request chain looks like:
(Controller)
[HttpGet("game/{Id}")]
public async Task<IActionResult> GetPerson(string Id)
{
    try
    {
        var response = await _GameService.GetGameAsync(Id);
        return Ok(new FilteredResponse(response, 200));
    }
Service
public async Task<PlayerFilteredGameState> GetGameAsync(string gameId, string apiKey)
{
    var response = await _ironmanDataHandler.GetGameAsync(gameId);
    var filteredGame = _responseFilterHelper.FilterForPlayer(response, apiKey);
    return filteredGame;
}
Data handler
public async Task<GameState> GetGameAsync(string gameStateId)
{
    using (var db = _dbContextFactory.Create())
    {
        var specifiedGame = await db.GameStateIronMan.FirstOrDefaultAsync(a => a.gameId == gameStateId);
        if (specifiedGame == null)
        {
            throw new ApiException("There is no game with that ID.", 404);
        }
        var deserializedGame = JsonConvert.DeserializeObject<GameState>(specifiedGame.GameState);
        return deserializedGame;
    }
}
I've tried mocking all function return values and database accesses, replacing all computed values with null/new Game(), etc., but it doesn't improve the performance. I've spent lots of time with different performance analysis tools, but there isn't a single function that uses more than 0.5-1% of the CPU.
After a lot of investigation, the only "conclusion" I've reached is that it seems to have something to do with the internal functionality of async/await and the way we use it in our project, because it doesn't matter what we do in the called functions - as soon as we call a function the performance takes a huge hit.
I also tried making the functions synchronous just to see if there was something wrong with my system; however, performance is massively reduced if I do that (which is good, I suppose).
I really am at a loss here because we aren't really doing anything out of the ordinary and we're still having large issues.
UPDATE
I've performed a performance analysis in ANTS. I'm not really sure how to present the results, so I took a picture of what the call stack looks like.
If your gamestate is a large object, deserializing it can be quite taxing.
You could create a test where you just deserialize a saved game state, and do some profiling with various game states (a fresh start, after some time, ...) to see if there are differences.
If you find that deserializing takes a lot of CPU no matter what, you could look into changing the structure and seeing if you can optimize the amount of data that is saved.
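For example, a throwaway benchmark along these lines (the file path and iteration count are placeholders; it uses Newtonsoft.Json and System.Diagnostics) can show how much of the cost is pure deserialization:
// Quick-and-dirty sketch: time deserialization of a saved game state in isolation.
string json = File.ReadAllText("saved-gamestate.json");

var sw = Stopwatch.StartNew();
for (int i = 0; i < 100; i++)
{
    var state = JsonConvert.DeserializeObject<GameState>(json);
}
sw.Stop();

Console.WriteLine($"Average deserialization time: {sw.ElapsedMilliseconds / 100.0} ms");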
I am consuming a web service provided to me by a vendor in a C# application. This application calls a web method in a loop, which slows down performance. Getting the complete set of results takes more than an hour.
Can I apply multi threading on my side to consume this web service in multiple threads and combine the results together?
Is there any better approach to retrieve data in minutes instead of hours?
First of all, you have to make sure your vendor does indeed support this or does not prohibit it (which is very probable too).
The code itself to do this is fairly straightforward, using a method such as Parallel.For.
Simple Example (google.com):
Parallel.For(0, norequests, i =>
{
    // Code that does your request goes here
});
Explanation:
In a Parallel.For loop, all the requests get executed in parallel (as implied by the name), which could potentially provide a very significant increase in performance.
Further reading:
MSDN on Parallel.For loops
You should really ask your vendor. We can only speculate about why it takes that long or if firing multiple requests will actually yield the same results as the one that takes long.
Basically, sending one request and getting one response should beat the multi-threaded variant, because it should be easier to optimize on the server's side.
If you want to know why this is not the case with the current version of the service, ask the vendor.
These are just samples of how to call web services in parallel:
private void TestParallelForeach()
{
    string[] uris = { "http://192.168.1.2", "http://192.168.1.3", "http://192.168.1.4" };
    var results = new List<string>();
    var syncObj = new object();
    Parallel.ForEach(uris, uri =>
    {
        using (var webClient = new WebClient())
        {
            webClient.Encoding = Encoding.UTF8;
            try
            {
                var result = webClient.DownloadString(uri);
                lock (syncObj)
                {
                    results.Add(result);
                }
            }
            catch (Exception ex)
            {
                // Do error handling here...
            }
        }
    });
    // Do with "results" here....
}
I have a website on Rackspace which does calculations; a calculation can take anywhere from 30 seconds to several minutes. Originally I implemented this with SignalR but had to yank it due to excessive CC usage. Hosted Rackspace sites are really not designed for that kind of use. The bill went through the roof.
The basic code is as below, which works perfectly on my test server but of course gets a timeout error on Rackspace if the calculation takes more than 30 seconds, due to their watcher killing it (old code). I have been told that the operation must write to the stream to keep it alive. In the days of old I would have started a thread and polled the site until the thread was done. If there is a better way I would prefer to take it.
It seems that with .NET 4.5 I can use the HttpTaskAsyncHandler to accomplish this, but I'm not getting it. The (new code) below is, as I understand it, the handler you would use, taking the old code in the using block and placing it in the ProcessRequestAsync task. When I attempt to call CalcHandler / Calc I get a 404 error, which most likely has to do with routing. I was trying to follow this link but could not get it to work either. The add name is "myHandler" but the example link is "feed"; how did we get from one to the other? They mentioned they created a class library, but can the code be in the same project as the current code, and how?
http://codewala.net/2012/04/30/asynchronous-httphandlers-with-asp-net-4-5/
As a side note, will the HttpTaskAsyncHandler allow me to keep the request alive until it is completed if it takes several minutes? Basically, should I use something else for what I am trying to accomplish?
Old code
[Authorize]
[AsyncTimeout(5000)] // does not do anything on RackSpace
public async Task<JsonResult> Calculate(DataModel data)
{
    try
    {
        using (var db = new ApplicationDbContext())
        {
            var result = await CalcualteResult(data);
            return Json(result, JsonRequestBehavior.AllowGet);
        }
    }
    catch (Exception ex)
    {
        LcDataLink.ProcessError(ex);
    }
    return Json(null, JsonRequestBehavior.AllowGet);
}
New code
public class CalcHandler : HttpTaskAsyncHandler
{
    public override System.Threading.Tasks.Task ProcessRequestAsync(HttpContext context)
    {
        Console.WriteLine("test");
        return new Task(() => System.Threading.Thread.Sleep(5000));
    }
}
It's not the best approach. Usually you need to create a separate process ("worker role" in Azure).
This process will handle long-running operations and save the result to the database. With SignalR (or by calling an API method every 20 seconds) you will update the status of this operation on the client side (your browser).
If this process takes too much time to calculate, your server will become potentially vulnerable to DDoS attacks.
Moreover, it depends on configuration, but long-running operations could be killed by the server itself. By default, if I'm not mistaken, after 30 minutes of execution.
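A bare-bones sketch of that shape (the in-memory job table and the CalculateResult call are placeholders for illustration; in a real system the work would run in the separate process and the status would be persisted, as described above):
private static readonly ConcurrentDictionary<Guid, object> Jobs = new ConcurrentDictionary<Guid, object>();

public JsonResult StartCalculation(DataModel data)
{
    var jobId = Guid.NewGuid();
    Jobs[jobId] = "Running";

    // Hand the long-running work off; this request returns immediately with the job id.
    Task.Run(async () =>
    {
        var result = await CalculateResult(data); // hypothetical long-running calculation
        Jobs[jobId] = result;
    });

    return Json(new { jobId }, JsonRequestBehavior.AllowGet);
}

public JsonResult GetStatus(Guid jobId)
{
    // Called by the client every ~20 seconds (or replaced by a SignalR push).
    object status;
    return Json(Jobs.TryGetValue(jobId, out status) ? status : "Unknown job", JsonRequestBehavior.AllowGet);
}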
We are building a highly concurrent web application, and recently we have started using asynchronous programming extensively (using TPL and async/await).
We have a distributed environment, in which apps communicate with each other through REST APIs (built on top of ASP.NET Web API). In one specific app, we have a DelegatingHandler that after calling base.SendAsync (i.e., after calculating the response) logs the response to a file. We include the response's basic information in the log (status code, headers and content):
public static string SerializeResponse(HttpResponseMessage response)
{
    var builder = new StringBuilder();
    var content = ReadContentAsString(response.Content);
    builder.AppendFormat("HTTP/{0} {1:d} {1}", response.Version.ToString(2), response.StatusCode);
    builder.AppendLine();
    builder.Append(response.Headers);
    if (!string.IsNullOrWhiteSpace(content))
    {
        builder.Append(response.Content.Headers);
        builder.AppendLine();
        builder.AppendLine(Beautified(content));
    }
    return builder.ToString();
}

private static string ReadContentAsString(HttpContent content)
{
    return content == null ? null : content.ReadAsStringAsync().Result;
}
The problem is this: when the code reaches content.ReadAsStringAsync().Result under heavy server load, the request sometimes hangs on IIS. When it does, it sometimes returns a response -- but hangs on IIS as if it didn't -- or at other times it never returns.
I have also tried reading the content using ReadAsByteArrayAsync and then converting it to String, with no luck.
When I convert the code to use async throughout I get even weirder results:
public static async Task<string> SerializeResponseAsync(HttpResponseMessage response)
{
    var builder = new StringBuilder();
    var content = await ReadContentAsStringAsync(response.Content);
    builder.AppendFormat("HTTP/{0} {1:d} {1}", response.Version.ToString(2), response.StatusCode);
    builder.AppendLine();
    builder.Append(response.Headers);
    if (!string.IsNullOrWhiteSpace(content))
    {
        builder.Append(response.Content.Headers);
        builder.AppendLine();
        builder.AppendLine(Beautified(content));
    }
    return builder.ToString();
}

private static Task<string> ReadContentAsStringAsync(HttpContent content)
{
    return content == null ? Task.FromResult<string>(null) : content.ReadAsStringAsync();
}
Now HttpContext.Current is null after the call to content.ReadAsStringAsync(), and it keeps being null for all the subsequent requests! I know this sounds unbelievable -- and it took me some time and the presence of three coworkers to accept that this was really happening.
Is this some kind of expected behavior? Am I doing something wrong here?
I had this problem. Although I haven't fully tested it yet, using CopyToAsync instead of ReadAsStringAsync seems to fix the problem:
// Buffer the content into a MemoryStream instead of blocking on ReadAsStringAsync().Result.
var ms = new MemoryStream();
await response.Content.CopyToAsync(ms);

// Rewind and read the buffered bytes back as a string.
ms.Seek(0, SeekOrigin.Begin);
var sr = new StreamReader(ms);
responseContent = sr.ReadToEnd();
With regards to your second issue, async/await is syntactic sugar for the compiler building a state machine where the call to a function preceded by "await" returns immediately on the current thread...one that contains HttpContext.Current in its thread local storage. The completion of that async call can occur on a different thread...one that does NOT have HttpContext.Current in its thread local storage.
If you want the completion to execute on the same thread (thus having the same objects in thread local storage like HttpContext.Current), then you need to be aware of this behavior. This is especially important on calls from the main UI thread (if you're building a Windows application) or in ASP.NET, calls from an ASP.NET request thread where you are dependent on HttpContext.Current.
See reference docs on ConfigureAwait(false). Also, view some Channel 9 tutorials on TPL. Once the "easy" stuff is grokked, the presenter will invariably talk about this issue as it causes subtle problems that are not easily understood unless you know what the TPL is doing underneath the covers.
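For instance, the helper from the question could await with ConfigureAwait(false) (a sketch of the idea; note that HttpContext.Current will not be available after the await):
private static async Task<string> ReadContentAsStringAsync(HttpContent content)
{
    if (content == null)
    {
        return null;
    }

    // ConfigureAwait(false) resumes on a thread-pool thread instead of re-entering
    // the captured ASP.NET context, which is what the blocking .Result call deadlocks on.
    return await content.ReadAsStringAsync().ConfigureAwait(false);
}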
Good luck.
With regards to your first problem, if the caller gets a result, I'm not convinced that IIS has not completed the request. How are you determining that the ASP.NET request thread initiated by this caller is hung in IIS?