Do Entity Framework async methods consume ThreadPool threads? - c#

I usually use many EF Core async methods in my web application like this:
await db.Parents.FirstOrDefaultAsync(p => p.Id == id);
As we know, the initial number of threads in the ThreadPool is by default limited to the number of logical CPU cores, and user requests are also handled by ThreadPool threads.
Should I worry about handling user requests or performance issues due to many async calls in my application?

Should I worry about handling user requests or performance issues due to many async calls in my application?
EF Core provides an async query interface for repositories. Whether it's async all the way down, or whether certain methods block thread pool threads, depends on the EF provider. SQL Server's SqlClient has task-based async methods that don't block threads, and most other providers do too. But the EF in-memory provider, and perhaps the SQLite provider, may be async-over-sync: either completing synchronously and returning a completed Task, or blocking a thread pool thread.
So EF normally won't block your threads, and when you make an async call to the database it frees your application's thread to do more work, such as handling additional requests. But if you have too many concurrent requests to your database, each request will start to take more time.
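To make the thread hand-off concrete, here is a minimal sketch based on the query in the question; the AppDbContext, Parent entity and ParentsController names are assumptions, not anything from the original post.
using Microsoft.AspNetCore.Mvc;
using Microsoft.EntityFrameworkCore;
using System.Threading.Tasks;
[ApiController]
[Route("parents")]
public class ParentsController : ControllerBase
{
    private readonly AppDbContext db;
    public ParentsController(AppDbContext db) => this.db = db;
    [HttpGet("{id}")]
    public async Task<IActionResult> Get(int id)
    {
        // The request thread returns to the pool here; SqlClient uses async I/O,
        // so no thread is blocked while the database does its work.
        var parent = await db.Parents.FirstOrDefaultAsync(p => p.Id == id);
        if (parent == null)
            return NotFound();
        return Ok(parent);
    }
}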
When this happens you need a mechanism to slow down the rate of new requests to the database, otherwise you'll get into a bad state, e.g. where the database server has 2000 running requests, most of them on behalf of clients who've given up and timed out, and new requests aren't handled in a timely manner because of all the old ones.
Generally throughput increases as you add concurrency up to a point, but beyond that point overall throughput decreases, sometimes drastically.
It's up to you to limit overall concurrency to prevent severe degradation in throughput. It's better to fail some requests early (e.g. with an HTTP 503) than to accept them all and complete none within your SLA.
One of the benefits of synchronous database access is that it occupies an application thread for the duration of the database interaction, automatically adding backpressure to the request flow. Having a request wait for a thread pool thread when all of the thread pool threads are busy is actually a good thing. When you go all-async, this control goes away and you need to think about replacing it.
ASP.NET Core currently has no built-in throttling. Your web server host may have some, and, for instance, SqlConnection's connection pool limit serves to limit the number of concurrent requests per application instance. But you've got to have something that allows you to handle a surge in request volume in an orderly fashion.
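One way to add that control back is a small concurrency gate in middleware that fails fast with a 503 when the application is saturated. This is only a sketch: the ConcurrencyLimitMiddleware name, the limit of 100 and the 250 ms wait are all arbitrary assumptions.
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
// Hypothetical middleware: caps concurrent in-flight requests and sheds load
// instead of letting work pile up behind a saturated database.
public class ConcurrencyLimitMiddleware
{
    private static readonly SemaphoreSlim Gate = new SemaphoreSlim(100); // assumed limit
    private readonly RequestDelegate next;
    public ConcurrencyLimitMiddleware(RequestDelegate next) => this.next = next;
    public async Task InvokeAsync(HttpContext context)
    {
        // Wait briefly for a slot; if none frees up, reject early with 503.
        if (!await Gate.WaitAsync(TimeSpan.FromMilliseconds(250)))
        {
            context.Response.StatusCode = StatusCodes.Status503ServiceUnavailable;
            return;
        }
        try { await next(context); }
        finally { Gate.Release(); }
    }
}
It would be registered with app.UseMiddleware<ConcurrencyLimitMiddleware>() in the pipeline, ahead of the endpoints that hit the database.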

When should I use Task.Run in ASP.NET Core?

I'm of the belief that you should never have to use Task.Run for any operation in an ASP.NET Core web context. If you have a long-running or CPU-intensive task, you can offload it to a message queue for async processing. If you have a sync operation that has no equivalent async method, then offloading it to a background thread does nothing for you; it actually makes things slightly worse.
What am I missing? Is there a genuine reason to use Task.Run in a high throughput server application?
Some quick examples:
A logging system where request threads write to a queue and a dedicated worker thread dequeues items and writes them to a log file (a sketch follows this list).
To access an apartment-model COM server with expensive initialization, where it may be better to keep a single instance on its own thread.
For logic that runs on a timer, e.g. a transaction that runs every 10 minutes to update an application variable with some sort of status.
CPU-bound operations where individual response time is more important than server throughput.
Logic that must continue to run after the HTTP response has been completed, e.g. if the total processing time would otherwise exceed an HTTP response timeout.
Worker threads for system operations, e.g. a long running thread that checks for expired cache entries.
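As a sketch of the first bullet: request threads enqueue messages cheaply while one long-lived worker drains the queue. The QueueingLogger name and file-based sink are assumptions; a long-lived loop like this is typically started with Task.Factory.StartNew and TaskCreationOptions.LongRunning (or a dedicated Thread) rather than a plain Task.Run, so it doesn't tie up a regular pool thread.
using System;
using System.Collections.Concurrent;
using System.IO;
using System.Threading.Tasks;
public sealed class QueueingLogger : IDisposable
{
    private readonly BlockingCollection<string> queue = new BlockingCollection<string>();
    private readonly Task worker;
    public QueueingLogger(string path)
    {
        // LongRunning hints the scheduler to use a dedicated thread for this loop.
        worker = Task.Factory.StartNew(() =>
        {
            using (var writer = new StreamWriter(path, append: true))
            {
                // Blocks until items arrive; ends when CompleteAdding is called.
                foreach (var line in queue.GetConsumingEnumerable())
                    writer.WriteLine(line);
            }
        }, TaskCreationOptions.LongRunning);
    }
    public void Log(string message) => queue.Add(message);
    public void Dispose()
    {
        queue.CompleteAdding();
        worker.Wait();
    }
}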
Just to back up your belief:
Do not: Call Task.Run and immediately await it. ASP.NET Core already runs app code on normal Thread Pool threads, so calling Task.Run only results in extra unnecessary Thread Pool scheduling. Even if the scheduled code would block a thread, Task.Run does not prevent that.
This is the official recommendation/best practice from Microsoft. Although it doesn't point out something you might have missed, it does tell you that it is a bad idea and why.
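To illustrate the guidance, here is a minimal either/or sketch; the db context and Orders set are hypothetical, not from the original question.
// Anti-pattern: wrapping an already-async call in Task.Run just adds an extra,
// unnecessary hop through the thread pool before the await.
var order = await Task.Run(() => db.Orders.FirstOrDefaultAsync(o => o.Id == id));
// Preferred: await the async API directly; no extra scheduling is involved.
var order = await db.Orders.FirstOrDefaultAsync(o => o.Id == id);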

How can we achieve performance by implementing async controllers in ASP.NET MVC 6?

At what point is it advisable to use async controllers in ASP.NET MVC?
Is there any coding or performance costs involved?
MSDN recommends using it for long-running processes, but I was just curious whether it would be beneficial if we used it as a complete replacement for normal controllers.
We are planning to use WCF services with our controller methods.
First, async is not synonymous with "performance". In fact, using async can actually decrease performance as there's a non-trivial amount of overhead involved in async.
What async does do is release threads back to the pool when they're in a wait state. This means your web server has a higher threshold before it exhausts its "max requests" or, in other words, runs out of free threads to handle new requests.
In a synchronous request, the thread is tied up for the entire request. If there's some period of waiting involved (network latency from an API call, etc.), it holds on to that thread even though no work is actually being done. If you got hit with 1000 simultaneous requests (the typical out-of-the-box max for a web server), then each further request would be queued until one of the first 1000 threads was returned to the pool.
In an async request, as soon as the thread is waiting on something to happen (i.e. not doing work), it is given back to the pool, even though the original request it was serving has not yet completed. This allows a new request to be served. When the original task that forfeited the thread completes, a new thread is requested from the pool to continue servicing that request. This effectively gives your server a little breathing room when under load. Other than that, async does nothing, at least in the context of a request being served by a web server.
In general, using async is recommended, because even the little bit of breathing room it provides may mean the difference between your server handling load or falling down. However, you should gauge your usage of async to ensure that you're actually buying something worth the overhead it adds. For example, MVC 6 lets you do things like render partials asynchronously. If your server is equipped with an enterprise-class 15,000 RPM hard drive or an SSD, though, the wait the thread would experience would likely be so minuscule that passing the thread back and forth would actually take more time than running the operation synchronously.
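A minimal sketch of the thread hand-off described above; the PricesController and the URL are made up for illustration.
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
public class PricesController : Controller
{
    private static readonly HttpClient Client = new HttpClient();
    public async Task<IActionResult> Index()
    {
        // The request thread goes back to the pool during this network wait...
        var json = await Client.GetStringAsync("https://example.com/api/prices");
        // ...and a pool thread picks the request back up here to finish it.
        return Content(json, "application/json");
    }
}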
I would say that this topic is nicely covered on this post:
When should I use Async Controllers in ASP.NET MVC?
My opinion is that it's good to use async actions when you call async methods inside them (like I/O operations). It's not especially bad to make an async action without any async calls inside, but:
You will have a needless thread switch; not a big performance penalty, but not nice either.
VS will warn you that there is no await in your async action, which can lead to unnecessary Task.Run calls.

Why are asynchronous calls to my database desirable?

I've written a server which interacts with an MSSQL database. It's currently written in .NET 4.0 and uses NHibernate as an ORM to retrieve information from the database. When reading about .NET 4.5 and the introduction of the async/await keywords I learned that, unfortunately, NHibernate does not have support for async/await.
I don't understand why issuing an async call to a database would be beneficial. Don't all the requests queue at the database level anyway? Wouldn't async just increase points of failure without improving anything?
In general, the benefit is that you are not blocking the currently executing thread while a possibly expensive (asynchronous) operation runs. In the context of a WPF / Windows Forms application, this means you are not blocking the UI thread (if the request originates from that thread) and your application remains responsive.
In the context of a web application (say IIS), this means you are releasing a thread to the pool while awaiting the result. Since you are not blocking the thread, it can be reused to accept another request, which results in better performance in terms of accepted connections (not necessarily time per request).
Don't all the requests queue at the database level anyway?
No. Read Understanding how SQL Server executes a query. Any database server worth the name will be able to run hundreds of requests concurrently. Serialization is necessary only if the requests are correlated (e.g. you need the output of query 1 as a parameter to query 2) or when operating under transaction constraints (only one statement can be active at any time within a transaction).
There are at least two major advantages of async calls:
resource usage. Without considering anything else, just changing the programming model to an event-driven async model can result in an order-of-magnitude increase in the throughput your app can drive. This, of course, applies to back-end apps (e.g. a web server), not to a client user-driven app that will never send more than what the one user initiates. Read the articles linked from High Performance Windows programs. This is also important to read, even though a bit dated: Asynchronous Pages in ASP.NET 2.0.
overlapping requests. The synchronous model does not allow you to issue a query to the back end until the current one completes. A lot of the time the application has the information necessary (the parameters) to make two or more uncorrelated requests, but it simply can't. Async calls allow the controlling thread to issue all the requests in parallel and resume after they all complete.
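A small sketch of overlapping two uncorrelated queries; the connection string and table names are assumptions, and each query gets its own connection so both can be in flight at once.
using System.Data.SqlClient;
using System.Threading.Tasks;
static async Task<int[]> LoadCountsAsync(string connectionString)
{
    var customers = CountAsync(connectionString, "SELECT COUNT(*) FROM Customers");
    var orders = CountAsync(connectionString, "SELECT COUNT(*) FROM Orders");
    // Both queries run concurrently; execution resumes once both complete.
    return await Task.WhenAll(customers, orders);
}
static async Task<int> CountAsync(string connectionString, string sql)
{
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(sql, connection))
    {
        await connection.OpenAsync();
        return (int)await command.ExecuteScalarAsync();
    }
}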
Neither .NET 4.5 Tasks nor NHibernate has good support for async DB programming. Good old BeginExecuteXXX is actually much more powerful, although a bit arcane to program against.
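For reference, a sketch of that older APM-style pattern; the connection string, table and column are hypothetical, and pre-4.5 versions of SqlClient also required "Asynchronous Processing=true" in the connection string.
using System;
using System.Data.SqlClient;
static void QueryWithApm(string connectionString)
{
    var connection = new SqlConnection(connectionString);
    connection.Open();
    var command = new SqlCommand("SELECT Name FROM Parents", connection);
    command.BeginExecuteReader(asyncResult =>
    {
        // Runs on a callback thread once the server has results ready.
        using (var reader = command.EndExecuteReader(asyncResult))
        {
            while (reader.Read())
                Console.WriteLine(reader.GetString(0));
        }
        command.Dispose();
        connection.Dispose();
    }, null);
}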
NHibernate can support true async calls. I have already implemented it on my own branch:
https://github.com/ReverseBlade/nhibernate-core/tree/nh_4.5.1
You can check it out and compile it. It is compiled against .NET 4.5.1, is compatible with standard NHibernate, and passes all tests.
Then you can use things like .ToListAsync() or GetAsync(), and it will make true async calls.
If you need help, you can write a comment. Good luck.
Good news: NHibernate supports async/await out of the box since v5.0.
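A minimal sketch of that built-in API, assuming a hypothetical Parent entity mapped in the session factory.
using System;
using System.Threading.Tasks;
using NHibernate;
using NHibernate.Linq;
static async Task<Parent> LoadParentAsync(ISessionFactory sessionFactory, Guid id)
{
    using (var session = sessionFactory.OpenSession())
    {
        // Both calls are truly async; no thread is blocked while the database works.
        var parent = await session.GetAsync<Parent>(id);
        var total = await session.Query<Parent>().CountAsync();
        return parent;
    }
}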
You may be confusing language features with design patterns; async is syntactic sugar to help you manage background tasks, while asynchronous tasks just mean that you're running two or more threads.
Just because NHibernate doesn't support async doesn't mean that you can't run asynchronously. This is very beneficial to the user because you don't want to freeze the UI while you're performing a (relatively) long-running query to a DB/service, especially if the server is bogged down.
I suppose you could count this as a point of failure, but really there are just a few areas to handle:
Exceptions - You'd have this problem on one thread anyway, but you should gracefully handle any database errors that you'd encounter.
UI Management - You don't want to let the user interact with the UI in such a way as to trigger multiple queries, so you might disable a button, etc.
Result Handling - When the query is complete, you need to ensure that you marshal the data back to the UI thread. In C# this can be done via Invoke/BeginInvoke, though whether you're in WinForms or WPF determines the details.
EDIT:
Some sample skeleton code, assuming WPF and at least .NET 4.0:
Task.Factory.StartNew(() =>
{
    using (var client = new dbClient())
    {
        // Perform query here; 'result' stands for whatever the query produced
        this.Dispatcher.BeginInvoke(new Action(() =>
        {
            // Marshal the result back to the UI thread; set data source, etc., i.e.
            this.Items = result;
        }));
    }
}).ContinueWith(t => Logger.LogException(t.Exception), TaskContinuationOptions.OnlyOnFaulted);
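For comparison, a hedged sketch of the same flow with async/await, which requires .NET 4.5+ and a Task-returning data-access API (something NHibernate only gained in 5.0); QueryItemsAsync is an assumed method name on the answer's hypothetical dbClient.
private async void LoadItems()
{
    try
    {
        using (var client = new dbClient())
        {
            // await resumes on the captured UI context, so no Dispatcher call is needed.
            this.Items = await client.QueryItemsAsync();
        }
    }
    catch (Exception ex)
    {
        Logger.LogException(ex);
    }
}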
You say:
Don't all the requests queue at the database level anyway?
If by "queue" you mean "single-servicing queue" than the answer is no. SQL Server is a highly asynchronous and multi-threaded service that can service many, many queries simultaneously.
Even at a physical level, queueing (i.e. physical device servicing) is simultaneously split across the number of CPU cores and the number of physical disks that make up the disk array.
So the reason to make asynchronous calls to SQL Server is to be able to leverage some of that multi-threading/multi-servicing capacity into your own service.

Does an asynchronous web request in .NET help performance?

What is the performance difference, if any, between a regular request in C# .NET and an asynchronous one?
Depends on your use case. While there is no immediate performance benefit from using async in simple scenarios, it can be crucial in more complex ones, and for scalability.
For instance, sending multiple requests to many servers is obviously best done in parallel; this can be handled using async.
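A minimal sketch of that fan-out (the URLs are whatever you need to call): all requests are in flight at once and no thread is blocked during the waits.
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;
static async Task<string[]> FetchAllAsync(HttpClient client, string[] urls)
{
    // Start every request before awaiting any of them.
    var tasks = urls.Select(url => client.GetStringAsync(url));
    return await Task.WhenAll(tasks);
}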
Regarding scalability, consider a web application that uses sync web requests to communicate with slow external servers. Since IIS only allocates a limited number of threads to serve requests from users, when the number of users grows there is a risk that all user threads will be blocked waiting for the external web requests. This means some user requests will be rejected by IIS.
It depends. In ASP.NET you had to use synchronous calls (until 4.5); in a Windows Forms app, or any other type of C# project, though, async prevents the thread from being put on hold. Handling the async completed event allows you to update the relevant information without blocking the UI or the main thread.
In asynchronous mode the server can reply with only what you want, which minimizes network traffic, which in turn minimizes response time and improves the user experience.

Using ThreadPool.QueueUserWorkItem in ASP.NET

In ASP.NET, I am using ThreadPool.QueueUserWorkItem to create a huge PDF report. My requirement is that the report has to be created asynchronously, and I do not want to wait for the response. I plan to achieve it through the code below:
protected void Button1_Click(object sender, EventArgs e)
{
    ThreadPool.QueueUserWorkItem(report => CreateReport());
}

public void CreateReport()
{
    // This method will take 30 seconds to finish its work
}
My question is: will ThreadPool.QueueUserWorkItem create a new thread from the ASP.NET worker process or some system thread? Is this a good approach? I may have hundreds of concurrent users accessing the web page.
The QueueUserWorkItem() method utilizes the process's ThreadPool, which automatically manages a number of worker threads. These threads are assigned a task, run it to completion, and are then returned to the ThreadPool for reuse.
Since this is hosted in ASP.NET the ThreadPool will belong to the ASP.NET process.
The ThreadPool is a very good candidate for this type of work; as the alternative of spinning up a dedicated thread is relatively expensive. However, you should consider the following limitations of the ThreadPool as well:
The ThreadPool is used by other parts of .NET and provides a limited number of threads. If you overuse it, there is the possibility that your tasks will be blocked waiting for others to complete. This is especially a concern in terms of scalability; however, it shouldn't dissuade you from using the ThreadPool unless you have reason to believe it will be a bottleneck.
ThreadPool tasks must be carefully managed to ensure they are returned for reuse. Unhandled exceptions or returns from a background thread will essentially "leak" that thread and prevent it from being reused. In these scenarios the ThreadPool may effectively lose its threads and cause a serious slowdown or halt of the process (a guarding sketch follows below).
The tasks you assign to the ThreadPool should be short-lived. If your processing is intensive then it's a better idea to provide it with a dedicated thread.
All these topics relate to the simple concept that the ThreadPool is intended for small tasks, and that its threads provide a cost saving to the consuming code by being reused. Your scenario sounds like a reasonable case for using the ThreadPool; however, you will want to code around it carefully, and ensure you run realistic load tests to determine whether it is the best approach.
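As a small sketch of that second limitation, building on the question's own Button1_Click and CreateReport: catch everything inside the work item so a failure cannot escape the thread pool thread.
using System;
using System.Diagnostics;
using System.Threading;
protected void Button1_Click(object sender, EventArgs e)
{
    ThreadPool.QueueUserWorkItem(_ =>
    {
        try
        {
            CreateReport(); // the long-running work from the question
        }
        catch (Exception ex)
        {
            // Log and swallow; nothing should propagate off this thread-pool thread.
            Trace.TraceError(ex.ToString());
        }
    });
}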
The thread pool will manage the number of active threads as needed. Once a thread is done with a task it continues on the next queued task. Using the thread pool is normally a good way to handle background processing.
When running in an ASP.NET application there are a couple of things to be aware of:
ASP.NET applications can be recycled for various reasons. When this happens all queued work items are lost.
There is no simple way to signal back to the client web browser that the operation completed.
A better approach in your case might be to have a WCF service with a REST/JSON binding that is called by AJAX code on the client web page to do the heavy work. This would give you the ability to report progress and results back to the user.
In addition to what Anders Abel has already laid out, which I agree with entirely, you should consider that ASP.NET also uses the thread pool to respond to requests, so if you have long-running work like this using up a thread pool thread, it is effectively stealing resources that ASP.NET could otherwise use to fulfill other requests.
If you were to ask me how best to architect it, I would say you dispatch the work to a WCF service using one-way messaging over the MSMQ transport. That way it is fast to dispatch, resilient to failure, and processing of the requests on the WCF side can be more tightly controlled because the messages will just sit on the queue waiting to be processed. So if your server can only create 10 PDFs at a time, you would just set maxConcurrentCalls for the WCF service to 10 and it will pull a maximum of 10 messages off the queue at once. Also, if your service shuts down, it will simply resume processing when it starts back up.
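For illustration, a minimal sketch of that throttle configured in code (it can equally be set in config); the ReportService type is hypothetical.
using System.ServiceModel;
using System.ServiceModel.Description;
static void StartReportHost()
{
    var host = new ServiceHost(typeof(ReportService));
    // Only 10 messages are pulled off the MSMQ queue and processed at once.
    host.Description.Behaviors.Add(new ServiceThrottlingBehavior { MaxConcurrentCalls = 10 });
    host.Open();
}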
