What is the performance difference, if any, between a regular (synchronous) request in C# .NET and an asynchronous one?
Depends on your use case. While there is no immediate performance benefit from using async in simple scenarios, it can be crucial in more complex ones, and for scalability.
For instance, sending multiple requests to many servers is obviously best done in parallel, and async handles that well.
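A minimal sketch of that idea (the URLs and the HttpClient field are just placeholders): the three requests run concurrently, so the total time is roughly that of the slowest server rather than the sum of all three.

private static readonly HttpClient http = new HttpClient();

public async Task<string[]> FetchFromAllServersAsync()
{
    // Start all requests without awaiting, then await them together.
    var tasks = new[]
    {
        http.GetStringAsync("https://server1.example.com/api/data"),
        http.GetStringAsync("https://server2.example.com/api/data"),
        http.GetStringAsync("https://server3.example.com/api/data")
    };
    return await Task.WhenAll(tasks);
}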
Regarding scalability, consider a web application that uses synchronous web requests to communicate with slow external servers. Since IIS allocates only a limited number of threads to serve user requests, when the number of users grows there is a risk that all of those threads will be blocked waiting for the external web requests, which means some user requests will be rejected by IIS.
It depends. In ASP.NET you were largely stuck with synchronous requests (until 4.5); but in a Windows Forms app, or any other type of C# project, async keeps the thread from being put on hold. Handling the async completed event lets you update the relevant information without putting the UI or main thread on hold.
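For example, here is a rough Windows Forms sketch of that event-based pattern using WebClient (the control name and URL are made up): the click handler returns immediately, and the completed event fires back on the UI thread, so the label can be updated without freezing the form.

private void loadButton_Click(object sender, EventArgs e)
{
    var client = new System.Net.WebClient();
    client.DownloadStringCompleted += (s, args) =>
    {
        // Raised back on the UI thread, so it is safe to touch controls here.
        resultLabel.Text = args.Error == null ? args.Result : args.Error.Message;
    };
    client.DownloadStringAsync(new Uri("https://example.com/data"));
    // No blocking here - the UI stays responsive while the download runs.
}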
In asynchronous mode the server can reply with just the data you asked for, which minimizes network traffic, which in turn reduces the response time and improves the user experience.
I usually use many EF Core async methods in my web application like this:
await db.Parents.FirstOrDefaultAsync(p => p.Id == id);
As we know, the initial number of threads in the ThreadPool is limited by default to the number of logical CPU cores, and user requests are handled by ThreadPool threads.
Should I worry about handling user requests or performance issues due to many async calls in my application?
Should I worry about handling user requests or performance issues due to many async calls in my application?
EF Core provides an async query interface for repositories. Whether it's async-all-the-way, or whether certain methods block thread pool threads, depends on the EF provider. SQL Server's SqlClient has task-based async methods that don't block threads, and most other providers do too. But for the EF in-memory provider, for instance, or perhaps the SQLite provider, it may be async-over-sync: either completing synchronously and returning a completed Task, or blocking a thread pool thread.
So EF normally won't block your threads, and when you make an async call to the database it frees your application's thread to do more work, like handling additional requests. However, if you have too many concurrent requests to your database, each request will start to take more time.
When this happens you need a mechanism to slow down the rate of new requests to the database, otherwise you'll get into a bad state - e.g. the database server has 2,000 running requests, most of them on behalf of clients that have already given up and timed out, and new requests aren't handled in a timely manner because of all the old ones.
Generally, throughput increases as you add concurrency up to a point; beyond that point, overall throughput decreases, sometimes drastically.
It’s up to you to limit overall concurrency to prevent severe degradation in throughput. It’s better to fail some requests early (eg with an HTTP 503) than accept them all and not complete any within your SLA.
One of the benefits of using synchronous database access is that it occupies an application thread for the duration of the database interaction, automatically adding backpressure to the request flow. Having a request have to wait for a thread pool thread when all of the thread pool threads are busy is actually a good thing. When you go all async this control goes away and you need to think about replacing it.
ASP.NET Core currently has no built-in throttling. Your web server host may have some, and, for instance, SqlConnection's connection pool limit serves to limit the number of concurrent requests per application instance. But you've got to have something that allows you to handle a surge in request volume in an orderly fashion.
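One possible sketch of such a limit, reusing the EF query from the question and assuming an ASP.NET Core controller (the limit of 100 and the 2-second wait are arbitrary illustrations, not recommendations): a SemaphoreSlim caps concurrent database calls, and requests that can't get a slot quickly are failed fast with a 503 instead of piling up.

private static readonly SemaphoreSlim dbGate = new SemaphoreSlim(100);

public async Task<IActionResult> GetParent(int id)
{
    // Shed load early rather than queueing requests indefinitely.
    if (!await dbGate.WaitAsync(TimeSpan.FromSeconds(2)))
        return StatusCode(503);

    try
    {
        var parent = await db.Parents.FirstOrDefaultAsync(p => p.Id == id);
        return Ok(parent);
    }
    finally
    {
        dbGate.Release();
    }
}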
I'm new to .NET Web API. I know it's preferable to use async APIs instead of sync ones, but what is the difference?
If the API is sync and is called by one client, and another call comes in from a second client, then as far as I can tell no interruption happens and both calls go through simultaneously. So what's the benefit of making it async?
Update: as I understand it, if the number of requests is huge and I use async, the waiting time for some calls will be lower, because more threads are available to run tasks (some of them are released while they would otherwise be waiting on a database call, network call, etc.). Is that true?
In the case of sync, a thread is assigned exclusively to each request, and that thread is released only upon completion of that particular request.
In the case of async, the thread may be reused by other requests while the original request is waiting on I/O.
So if your application is I/O-bound you can see a significant improvement by using async; if your application is CPU-bound, async will not help nearly as much. A minimal sketch of the difference follows the links below.
https://en.wikipedia.org/wiki/I/O_bound
https://en.wikipedia.org/wiki/CPU-bound
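Here is the sketch, assuming a Web API 2 controller, with Thread.Sleep and Task.Delay standing in for a real database or network call: the sync version holds its request thread for the entire wait, while the async version hands the thread back to the pool until the wait completes.

public IHttpActionResult GetSync()
{
    Thread.Sleep(1000);        // the request thread is blocked for the whole second
    return Ok("done");
}

public async Task<IHttpActionResult> GetAsync()
{
    await Task.Delay(1000);    // the thread is released to serve other requests
    return Ok("done");
}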
First of all, reiterating the difference between sync and async:
{------------- Sync Task1-------------}{-------------- Sync Task 2------------}
{---------------------- async task 1 --------------------}
{---------------------- async task 2 --------------------}
I hope by this point you can see why it's beneficial. Imagine a situation where your API is serving a list of 1000 basketball players and their details while requests are also coming in for a list of cities. I bet your client app would look neater if it got something back while the player list is still being served, wouldn't it?
Secondly, APIs don't "prefer" async as such; it's a programming choice you make. If you make use of the full language and operating-system capabilities, it's your application and its users that get the benefit.
Caching: using async does help caching if you use the new in-memory cache or a custom server-level cache. After all, your client is often just looking for a 304 response, and while a long request is being served, small requests such as cache checks can still be served.
I'm a bit confused, my ASP.NET MVC app will be hosted on a server, so is there any point in making it multi-threaded? For example, if I want one thread to execute my translations, is this a good idea? Can someone elaborate this to me please? I'm a bit confused with web apps multi-threading versus desktop apps multi-threading.
There are a few things to this.
The first is that every ASP.NET application (MVC or otherwise) is inherently multi-threaded: Each request will be processed on a separate thread, so you are automatically in a multi-threading situation and must consider this with any shared access to data (e.g. statics, etc.).
Another is that with MVC it's particularly easy to write asynchronous controller methods like:
public async Task<ActionResult> Index(int id)
{
    var model = await SomeMethodThatGetsModelAsync(id);
    return View(model);
}
Now, if we're already multi-threaded then why bother? The benefit is (ironically in a way) to use fewer threads. Assuming that SomeMethodThatGetsModel(id) may block or otherwise hold up the thread, awaiting on SomeMethodThatGetsModelAsync(id) allows the current thread to handle another request. One of the limits on how many requests a webserver can handle is how many threads it can have handling those requests. Free up threads and you increase your throughput.
A further is that you may want some operation to happen in the background of the application as a whole, here the reason is the same as with desktop applications.
Similarly, if you have work that can be done simultaneously and which blocks (e.g. hitting a database and two web services), then your reason for doing so in a multi-threaded manner is the same as with a desktop app.
(In the last two cases though, be wary of using the default static thread pool, such as through ThreadPool.QueueUserWorkItem or Task.Run. Because this same thread pool is used for the main ASP.NET threads, if you hit it heavily you're eating from the same plate as your framework. A few such uses are absolutely fine, but if you're making heavy use of separate threads then use a separate set of threads for them, perhaps with your own pooling mechanism.)
is there any point in making it multi-threaded?
Framed that way, the question doesn't quite work. The real question is: does your application need multi-threading? For example, if you receive a collection of big entities that need to be preprocessed somehow before further actions, you might process each of them on a separate thread instead of in a loop, as in the sketch below.
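A hedged sketch of that idea (bigEntities and PreprocessEntity are placeholders for whatever the real collection and per-entity work are): Parallel.ForEach spreads the CPU-bound preprocessing across the available cores instead of doing it one item at a time.

Parallel.ForEach(bigEntities, entity =>
{
    // CPU-bound preprocessing runs concurrently across the available cores.
    PreprocessEntity(entity);
});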
I'm a bit confused with web apps multi-threading versus desktop apps multi-threading
Multithreading in ASP.NET and in desktop apps is the same thing and works the same way.
Async has become a buzzword in .net and MS have introduced it in Web API 2 so that more requests can be handled whilst others are waiting on IO to finish.
Whilst I can see the benefit of this, is it really a concern? A x64 architecture has 30000+ threads in the Thread Pool so unless you have that many concurrent users on your website is async really required? Even if you have that many concurrent users without caching I'm pretty sure SQL Server will fall over with that many requests?
Apart from it being shiny when is there a real need to have async routing on a web framework?
Many of the other answers here are coming from a UI (desktop/mobile app) perspective, not a web server perspective.
Async has become a buzzword in .net and MS have introduced it in Web API 2 so that more requests can be handled whilst others are waiting on IO to finish.
async and await were introduced in .NET 4.5 / VS 2012. However, ASP.NET has had asynchronous request capability since .NET 2.0 - a very long time ago. And there have been people using it.
What async and await bring to the table is asynchronous code that is easy to maintain.
Whilst I can see the benefit of this, is it really a concern?
The key benefit of async on the server is scalability. Simply put, async tasks scale far better than threads.
@Joshua's comment is key regarding the memory; a thread takes a significant amount of memory (and don't forget the kernel-mode stack, which cannot be paged out), while an async request literally only takes a few hundred bytes.
There's also bursting to consider. The .NET threadpool has a limited injection rate, so unless you set your minWorkerThread count to a value much higher than you normally need, then when you get a burst of traffic some requests will 503 before .NET can spin up enough threads to handle them. async keeps your threads free (as much as possible) so it handles bursting traffic better.
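A sketch of that mitigation (the value 200 is purely illustrative and should be tuned rather than copied): raising the worker-thread minimum lets the pool respond to a burst without waiting on its slow injection rate.

ThreadPool.GetMinThreads(out int workerThreads, out int completionPortThreads);
ThreadPool.SetMinThreads(Math.Max(workerThreads, 200), completionPortThreads);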
An x64 architecture has 30,000+ threads in the thread pool, so unless you have that many concurrent users on your website, is async really required?
@Joshua is again correct when he points out that you're probably thinking of a request queue limit (which defaults to 1000 for the IIS queue and 5000 for the ASP.NET request limit). It's important to note that once this queue is filled (during bursty traffic), new requests are rejected with 503.
Even if you have that many concurrent users without caching I'm pretty sure SQL Server will fall over with that many requests?
Ah, now that's another question entirely.
I'm giving a talk at ThatConference 2013 specifically on async servers. One part of that talk is situations where async doesn't help (my Twitter update).
There's an excellent blog post here that takes the position that asynchronous db calls are just not worth the effort. It's important to note the assumptions in this post:
At the time that post was written, asynchronous web servers were difficult. These days we have async/await, and more and more libraries are offering asynchronous APIs (e.g., Entity Framework).
The architecture assumes a single web server with a single SQL Server backend. This was a very common setup traditionally, but is quickly changing today.
Where async servers really shine is when your backend can also scale. E.g., a web service, Azure SQL, NoSQL cluster, etc. Example: I'm writing an MVC/WebAPI server that uses Azure SQL and Storage for its backend (for all practical purposes, I can act like they have infinite scalability); in that case, I'm going to make my server async. In situations like this, you can scale your server 10x or more by using async.
But if you just have a single SQL Server backend (and have no plans to change to Azure SQL), then there's no point in making your web server async because you're limited by your backend anyway.
When long operations can be efficiently executed in parallel. For instance, you have to execute two SQL queries and load three pictures: do all five operations async and await them all. In this case the overall time will be the duration of the longest of the five operations, not the sum of the durations.
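A toy sketch of that claim, with Task.Delay standing in for the two SQL queries and three picture downloads: all five awaited operations overlap, so the elapsed time comes out at roughly the slowest one (about 3 seconds here), not the ~9-second sum.

public static async Task DemoAsync()
{
    var sw = System.Diagnostics.Stopwatch.StartNew();

    Task sql1 = Task.Delay(TimeSpan.FromSeconds(2));   // stands in for the first SQL query
    Task sql2 = Task.Delay(TimeSpan.FromSeconds(3));   // second SQL query
    Task pic1 = Task.Delay(TimeSpan.FromSeconds(1));   // picture #1
    Task pic2 = Task.Delay(TimeSpan.FromSeconds(1));   // picture #2
    Task pic3 = Task.Delay(TimeSpan.FromSeconds(2));   // picture #3

    await Task.WhenAll(sql1, sql2, pic1, pic2, pic3);

    Console.WriteLine(sw.Elapsed);   // ~00:00:03, the longest single operation
}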
Pre-fetch. If you can predict (with good probability) what the user will do next (e.g. they will almost certainly want to see the details...), you can start preparing the next page (frame, window) while the user is still reading the previous one.
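A small sketch of that pre-fetch pattern (LoadDetailsAsync, Details, and selectedId are hypothetical names): start the likely-next load without awaiting it, keep serving the current page, and only await the task when the details are actually requested.

Task<Details> detailsTask = LoadDetailsAsync(selectedId);   // kicked off early, not awaited yet

// ... the user keeps reading the current page ...

Details details = await detailsTask;   // usually already completed by the time it's needed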
Where did you get 30,000 from? I don't remember exactly, but I think ASP.NET uses 12 × the number of cores as its thread count.
I have to use async when an operation takes a long time (upload, export, processing) and the user needs to know about its progress.
You need async in the following scenarios:
1) When you are performing a very long operation and you don't want to freeze your UI.
2) When you have designed some task that needs to be completed in the background.
For example, you are rendering images from a database but you don't want your page to freeze while that happens; async is really helpful there.
Sometimes there is a lot that needs to be done when a given Action is called. Many times, there is more that needs to be done than what needs to be done to generate the next HTML for the user. In order to give the user a faster experience, I want to only do what I need to do to get them their next view and send it off, but still do more things afterwards. How can I do this, multi-threading? Would I then need to worry about making sure different threads don't step on each other's feet? Is there any built-in functionality for this type of thing in ASP.NET MVC?
As others have mentioned, you can use a spawned thread to do this. I would take care to consider the 'criticality' of several edge cases:
If your background task encounters an error and fails to do what the user expected to be done, do you have a mechanism to report this failure to the user?
Depending on how "business critical" the various tasks are, using a robust/resilient message queue to store "background tasks to be processed" will help protect against a scenario where the user requests some action, and the server responsible crashes, or is taken offline, or the IIS service is restarted, etc., and the background thread never completes.
Just food for thought on other issues you might need to address.
How can I do this, multi-threading?
Yes!
Would I then need to worry about making sure different threads don't step on each other's feet?
This is something you need to take care of anyway, since two different ASP.NET requests could arrive at the same time (from different clients) and be handled in two different worker threads simultaneously. So any code accessing shared data needs to be written in a thread-safe way anyway, even without your new feature.
Is there any built in functionality for this type of thing in ASP.NET MVC?
The standard .net multi-threading techniques should work just fine here (manually starting threads, or using the Task features, or using the Async CTP, ...).
It depends on what you want to do, and how reliable you need it to be. If the operations pending after the response was sent are OK to lose, then .NET async calls, the ThreadPool, or a new Thread are all going to work just fine. If the process crashes, the pending work is lost, but you have already accepted that this can happen.
If the work requires any reliability guarantee, for instance because it incurs updates in the site database, then you cannot rely on in-process .NET threading; you need to persist the request to do the work and then process that work even after a process restart (an app-pool recycle, as IIS so politely calls it).
One way to do this is to use MSMQ. Another way is to use a database table as a queue, as sketched below. The most reliable way is to use the database activation mechanisms, as described in Asynchronous procedure execution.
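A rough sketch of the table-as-queue idea, assuming an open SqlConnection named connection, an invented WorkQueue table, and serializedWorkItem holding whatever description of the work you persist: the request handler only records the pending work, and a separate worker process polls the table, so the work survives an app-pool recycle.

using (var cmd = connection.CreateCommand())
{
    // Record the work to be done; a background worker picks up 'Pending' rows later.
    cmd.CommandText =
        "INSERT INTO WorkQueue (Payload, Status, EnqueuedAt) " +
        "VALUES (@payload, 'Pending', SYSUTCDATETIME())";
    cmd.Parameters.AddWithValue("@payload", serializedWorkItem);
    cmd.ExecuteNonQuery();
}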
You can start a background task, then return from the action. This example uses the Task Parallel Library, found in .NET 4.0:
public ActionResult DoSomething()
{
    // Kick off the extra work on a background thread (fire-and-forget),
    // then return the view to the user immediately.
    Task t = new Task(() => DoSomethingAsynchronously());
    t.Start();

    return View();
}
I would use MSMQ for this kind of work. Rather than spawning threads in an ASP.NET application, I'd use an asynchronous, out-of-process approach. It's very simple and very clean.
In fact I've been using MSMQ in ASP.NET applications for a very long time and have never had any issues with this approach. Further, having a different process (that is, an executable running in a different app domain) do the long-running work is an ideal way to handle it, since your web application is not tied up doing that work. So IIS, the thread pool, and your web application can continue to do what they need to, while other processes handle the long-running tasks. A minimal sketch of handing work off to a queue follows.
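This sketch requires a reference to System.Messaging; the queue path, method name, and string payload are illustrative. The web app only enqueues a message; the worker executable reads from the same queue and does the actual work.

private static void EnqueueWork(string payload)
{
    const string queuePath = @".\Private$\LongRunningWork";

    if (!MessageQueue.Exists(queuePath))
        MessageQueue.Create(queuePath);

    using (var queue = new MessageQueue(queuePath))
    {
        // The body is serialized by the default XmlMessageFormatter.
        queue.Send(payload);
    }
}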
Maybe you should give it a try: Using an Asynchronous Controller in ASP.NET MVC