ASP.NET long-running task: "Thread was being aborted" exception - C#

An ASP.NET 3.5 web app has to start several tasks that take hours to complete. For obvious reasons the pages that start these tasks cannot wait for them to finish, nor will anyone want to wait that long for a response, so the tasks must be asynchronous.
There is a Helper class to handle all of these long-running tasks. The main method that schedules and executes them is currently the following:
public static bool ScheduleTask(TaskDescriptor task, Action action)
{
    // Returns false if the task is already scheduled/running.
    bool notAlreadyRunning = TasksAsync.TryAdd(task);
    if (notAlreadyRunning)
    {
        Thread worker = null;
        worker = new Thread(() =>
        {
            try { action(); }
            catch (Exception e)
            {
                Log.LogException(e, "Worker");
            }
            // Remove the bookkeeping entries once the work finishes (or fails).
            TasksAsync.RemoveTask(task);
            workers.Remove(worker);
        });
        workers.Add(worker);
        worker.Start();
    }
    return notAlreadyRunning;
}
In earlier implementations we used the ThreadPool.QueueUserWorkItem approach, but the result has always been the same: after approx. 20-30 minutes a "Thread was being aborted" exception (ThreadAbortException) is thrown.
Does anyone know why this is happening, or how it can be prevented?
More Info:
IIS standard configuration.
Tasks could be anything: queries to a database and/or IO operations, etc.
UPDATE: Decisions
Thank you all for your responses. Now I don't know which answer to mark as accepted. All of them are valid and are possible solutions to this problem. I will wait until the end of today and accept the answer with the most upvotes; in case of a draw I will choose the first answer shown, as they are typically ordered by relevance.
For anyone who wants to know: the solution I chose, again due to time constraints, was to change the IIS recycling configuration. But what I consider to be the ideal solution, based on my research and of course the answers below, is to create a "Worker Service" and use a communication mechanism between the ASP.NET app and that new "Worker Service" to coordinate the long-running work.

You can start the long-running process in its own application domain.
In the past, when I've needed this capability, I've created a Windows Service for this purpose. If you use WCF to connect to it, it doesn't even have to run on the IIS machine at all; you can run it on any machine on the network.
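As a rough sketch of what the WCF contract for such a service could look like (the names here are illustrative, not an existing API):
using System.ServiceModel;

[ServiceContract]
public interface ILongRunningTaskService
{
    // One-way: the web app fires the request and returns immediately,
    // while the service does the hours-long work in its own process.
    [OperationContract(IsOneWay = true)]
    void StartTask(string taskName);
}

public class LongRunningTaskService : ILongRunningTaskService
{
    public void StartTask(string taskName)
    {
        // ... hours of database queries, IO, report generation, etc.
    }
}

// Inside the Windows Service's OnStart, host it with something like:
//   _host = new ServiceHost(typeof(LongRunningTaskService));
//   _host.Open();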

Chances are you can get this working by upping the timeout, using a different app pool, or a variety of other hacks, but your best bet is to decouple the long-running task from the UI and ASP.NET completely, and use either a service (which I wouldn't recommend) or a scheduled task that polls for work to do. Personally I would use something like AWS SQS/SNS to keep track of the work to be done, and a scheduled task in Windows Server that checks for things to do at whatever frequency makes sense. The only thing the UI/ASP.NET then needs to do is record the fact that something needs to be done, not actually do it.
Another benefit of this message-based approach is that, should the long-running process become even longer running or overworked, you'd have the opportunity to add more worker tasks or servers to complete those requests.
Perhaps more than you can implement for your immediate problem, but something to consider for a better long term solution.

Related

Cannot execute subtask of a task in IIS 7

I'm developing a system that works with the following structure:
Level-1 sub-tasks are created by the main task via the .NET 4.0 TaskFactory, and Level-2 sub-tasks are created by the Level-1 sub-tasks in the same way.
All tasks work fine in debug, but when I deploy to the IIS server, all Level-2 sub-tasks stop executing.
I'm not an experienced IIS developer; does IIS have any limitation on multi-level threads (tasks)?
If you do spawn new threads while handling a request, make sure they have all finished before returning a response. You may think "I'm gonna return to the user asap, and leave a thread in the background saving stuff to the database." This is dangerous, mainly because the AppDomain may be recycled, aborting your background threads.
Alternatively, use the IRegisteredObject interface to tell asp.net you're doing some work in the background. The approach is detailed here: http://haacked.com/archive/2011/10/16/the-dangers-of-implementing-recurring-background-tasks-in-asp-net.aspx/
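A minimal sketch of that approach, following the pattern in the linked article; the JobHost class and the GenerateReport call are illustrative names, not part of any framework:
using System;
using System.Threading;
using System.Web.Hosting;

public class JobHost : IRegisteredObject
{
    private readonly object _lock = new object();
    private bool _shuttingDown;

    public JobHost()
    {
        // Tell ASP.NET this object is doing background work, so Stop() is called
        // before the AppDomain is torn down instead of the thread being aborted.
        HostingEnvironment.RegisterObject(this);
    }

    public void Stop(bool immediate)
    {
        // Block shutdown until any work currently holding the lock has finished.
        lock (_lock) { _shuttingDown = true; }
        HostingEnvironment.UnregisterObject(this);
    }

    public void DoWork(Action work)
    {
        lock (_lock)
        {
            if (_shuttingDown) return;   // don't start new work while shutting down
            work();
        }
    }
}
Usage, for example from a page or controller (GenerateReport is a placeholder for your own work):
private static readonly JobHost Host = new JobHost();
// ...
ThreadPool.QueueUserWorkItem(_ => Host.DoWork(() => GenerateReport()));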
Do note that this is not fail-proof though, and that there are better approaches to this - it all depends on what you're trying to achieve. In most cases (e.g., sending out email notifications), the best solution is to schedule a task (in a separate process) that dequeues work items periodically.

Thread in IIS won't start - C#

Before I go into this question, I'd like to say that I have read about the threading model for IIS 7 and 7.5, so I know how threads are handled.
My application starts a thread when a request comes in.
You can think of these threads as cron jobs.
A GET request comes in, let's say to /Handle.
Within the scope of /Handle I start a thread from that action, THREAD A.
I am not long-polling the GET request, so it returns to the user right away, and the thread handling the GET is returned to the pool.
Then I wait until THREAD A completes before doing anything else.
So no threads are running as far as I know: both the thread that was handling the GET and THREAD A have exited.
I make the same request a few times sequentially, always waiting for both threads to exit.
After a while the `Thread.Start()` call blocks.
Questions :
I know that the threads are returning and I am not leaking any ghost threads.
Why is IIS not allowing me to start new threads after about 4-5 requests?
What is the right way to create an application thread for the user application?
If I write Thread t = new Thread(), does this allocate a thread from the pool that handled the GETs, or from the CLR?
I am using IIS7.
I know that I exit each thread; I call Join on THREAD A and it never blocks. At this point I am not worried about scalability, so I always have ONE user hitting the server sequentially.
So to answer your question "What is the right way to create an application thread for the user application?" (i.e. an ASP.NET application) - you have many options:
run on the ASP.NET thread, without any extra threading - ASP.NET will still handle more than one request
use async calls (see async operations) for long-running operations - a minimal sketch of this option follows after this list
use CLR ThreadPool
send a message to some other server (e.g. using WCF services), so the long running processing takes place outside the Web server.
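A minimal sketch of the second option (async operations), assuming a WebForms page with Async="true" in its @Page directive; the page, method and helper names here are illustrative:
using System;
using System.Web.UI;

public class ReportPage : Page
{
    private Func<string> _work;

    protected void Page_Load(object sender, EventArgs e)
    {
        // Register an async task; ASP.NET releases the request thread while the
        // work is in flight and resumes the page when it completes or times out.
        RegisterAsyncTask(new PageAsyncTask(BeginWork, EndWork, WorkTimedOut, null));
    }

    private IAsyncResult BeginWork(object sender, EventArgs e, AsyncCallback cb, object state)
    {
        _work = DoSlowWork;                 // hypothetical long-running call
        return _work.BeginInvoke(cb, state);
    }

    private void EndWork(IAsyncResult ar)
    {
        string result = _work.EndInvoke(ar);
        // ... use the result to populate the page
    }

    private void WorkTimedOut(IAsyncResult ar)
    {
        // Handle the timeout (configured via the AsyncTimeout page attribute).
    }

    private string DoSlowWork()
    {
        // e.g. a slow database query or web service call
        return "done";
    }
}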
You mentioned reading about threading in ASP.NET, but in "MSDN: Performing Asynchronous Work, or Tasks, in ASP.NET Applications" there's a relatively short description of how threading in ASP.NET works. At the end of the post, there's a question:
"Q4: Should I create my own threads (new Thread)?" and the answer for that question is "A4) Please don’t (create new Threads). Or to put it a different way, no!!! (...) ".
And to answer your question "Why is IIS not allowing me to start new threads after 4-5 requests?":
That's really strange behaviour; maybe IIS knows that you are doing it wrong ;)

Windows service to do a job every 6 hours

I've got a Windows service with only two methods: a private method DoWork(), and an exposed method which calls DoWork(). I want to achieve the following:
The Windows service runs the DoWork() method every 6 hours.
An external program can also invoke the exposed method which calls DoWork(). If the service is already running DoWork() (triggered by the schedule), the requested call should be invoked again after the current run ends.
What's the best approach to this problem? Thanks!
An alternative approach would be to make use of a console application that can be scheduled by the Windows Task Scheduler to run every 6 hours. In that case you don't waste resources keeping a Windows service running the entire time, but only consume resources when needed.
For your second question: if you take the console app approach, you can have it called by using Process.Start, for example.
If the purpose of your application is only to run a specific task every six hours, you might be better off creating a command line application and creating a scheduled task that Windows runs automatically. Obviously, you could then manually start this application.
If you're still convinced you need a service (and honestly, from what I've seen so far, it sounds like you don't), you should look into using a Timer, but choose your timer carefully and read this article to get a better understanding of the timers built into .NET (Hint: Pay close attention to System.Timers.Timer).
To prevent reentry if another method tries to call DoWork() while the process is in the middle of performing its operation, look into using either a Mutex or a Semaphore.
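As a rough sketch of that combination, assuming a class owned by the Windows service; all names other than the framework types are illustrative:
using System;
using System.Threading;

public class SixHourWorker
{
    private readonly System.Timers.Timer _timer;
    private readonly object _gate = new object();

    public SixHourWorker()
    {
        // System.Timers.Timer raises Elapsed on a ThreadPool thread every 6 hours.
        _timer = new System.Timers.Timer(TimeSpan.FromHours(6).TotalMilliseconds);
        _timer.Elapsed += (s, e) => RunDoWork();
        _timer.Start();
    }

    // Called both by the timer and by the externally exposed service method.
    public void RunDoWork()
    {
        // Monitor.TryEnter guards against re-entry; a named Mutex or a Semaphore
        // would do the same job (and also work across processes, if needed).
        if (!Monitor.TryEnter(_gate))
        {
            // Already running. The requirement of "run again after the current
            // pass ends" could be met by setting a pending flag here.
            return;
        }
        try { DoWork(); }
        finally { Monitor.Exit(_gate); }
    }

    private void DoWork()
    {
        // ... the six-hourly job
    }
}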
There are benefits and drawbacks either way. My inclination with those options is to choose the Windows service because it makes your deployment easier. Scheduling things with the Windows Task Scheduler is scriptable and can be automated for deployment to a new machine/environment, but it's still a little more nonstandard than just deploying and installing a Windows service. With Task Scheduler you also need to make sure it is running under an account that can make the web service call, and that you aren't going to have problems with passwords expiring and your scheduled tasks suddenly not running. With a Windows service, though, you need to have some sort of monitoring in place to make sure it is always running, and that if it restarts you don't lose the state that lets it know when it should run next.
Another option you could consider is using NServiceBus sagas. Sagas are really intended for more than just scheduling tasks (they persist state for workflow-type processes that last longer than a single request/message), but they have a nice way of handling periodic or time-based processes (which is a big part of long-running workflows), in that a saga can request that a timeout manager send it a message back at a time it specifies. Using NServiceBus is a bigger architectural question and probably well beyond the scope of what you are asking here, but sagas have become how I think about periodic processes, and they come with the added benefit of being able to manage some persistent state for your process (which may or may not be a concern) and give you a reason to think about some architectural questions that perhaps you haven't considered before.
You can create a console application for your purpose and schedule it to run every 6 hours. The console application has a default method (Main) that is called on application start; you can call your routine from that method. Hope this helps!

ThreadPool.QueueUserWorkItem use in ASP.NET

In ASP.NET, to create a huge PDF report I am using ThreadPool.QueueUserWorkItem. My requirement is that the report has to be created asynchronously, and I do not want to wait for the response. I plan to achieve this with the code below:
protected void Button1_Click(object sender, EventArgs e)
{
    ThreadPool.QueueUserWorkItem(report => CreateReport());
}

public void CreateReport()
{
    // This method will take 30 seconds to finish its work
}
My question is: will ThreadPool.QueueUserWorkItem create a new thread from the ASP.NET worker process or some system thread? Is this a good approach? I may have hundreds of concurrent users accessing the web page.
The QueueUserWorkItem() method utilizes the process's ThreadPool, which automatically manages a number of worker threads. These threads are assigned a task, run it to completion, and are then returned to the ThreadPool for reuse.
Since this is hosted in ASP.NET the ThreadPool will belong to the ASP.NET process.
The ThreadPool is a very good candidate for this type of work; as the alternative of spinning up a dedicated thread is relatively expensive. However, you should consider the following limitations of the ThreadPool as well:
The ThreadPool is used by other aspects of .NET, and provides a limited number of threads. If you overuse it there is the possibility your tasks will be blocked waiting for others to complete. This is especially a concern in terms of scalability--however it shouldn't dissuade you from using the ThreadPool unless you have reason to believe it will be a bottleneck.
The ThreadPool tasks must be carefully managed to ensure they are returned for reuse. Unhandled exceptions or returns from a background thread will essentially "leak" that thread and prevent it from being reused. In these scenarios the ThreadPool may effectively lose its threads and cause a serious slowdown or halt of the process.
The tasks you assign to the ThreadPool should be short-lived. If your processing is intensive then it's a better idea to provide it with a dedicated thread.
All these topics relate to the simple concept that the ThreadPool is intended for small tasks, and for its threads to provide a cost saving to the consuming code by being reused. Your scenario sounds like a reasonable case for using the ThreadPool--however you will want to carefully code around it, and ensure you run realistic load tests to determine if it is the best approach.
The thread pool will manage the number of active threads as needed. Once a thread is done with a task it continues on the next queued task. Using the thread pool is normally a good way to handle background processing.
When running in an ASP.NET application there are a couple of things to be aware of:
ASP.NET applications can be recycled for various reasons. When this happens all queued work items are lost.
There is no simple way to signal back to the client web browser that the operation completed.
A better approach in your case might be to have a WCF service with a REST/JSON binding that is called by AJAX code on the client web page to do the heavy work. This would give you the possibility of reporting progress and results back to the user.
In addition to what Anders Abel has already laid out, which I agree with entirely, you should consider that ASP.NET also uses the thread pool to respond to requests, so if you have long running work like this using up a thread pool thread, it is technically stealing from the resources which ASP.NET is able to use to fulfill other requests anyway.
If you were to ask me how best to architect it, I would say dispatch the work to a WCF service using one-way messaging over the MSMQ transport. That way it is fast to dispatch, resilient to failure, and processing of the requests on the WCF side can be more tightly controlled, because the messages will just sit on the queue waiting to be processed. So if your server can only create 10 PDFs at a time, you would just set maxConcurrentCalls for the WCF service to 10 and it will only pull a maximum of 10 messages off the queue at once. Also, if your service shuts down, it will simply resume processing when it starts up again.
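A rough sketch of hosting that worker as a WCF service over MSMQ with a throttle of 10 concurrent calls; the queue path and contract names are illustrative, and the private queue is assumed to already exist (transactional, to match NetMsmqBinding's default ExactlyOnce setting):
using System;
using System.ServiceModel;
using System.ServiceModel.Description;

class WorkerHost
{
    static void Main()
    {
        var host = new ServiceHost(typeof(ReportService));

        // One-way messages arrive via a local private MSMQ queue.
        host.AddServiceEndpoint(typeof(IReportService),
            new NetMsmqBinding(NetMsmqSecurityMode.None),
            "net.msmq://localhost/private/reportQueue");

        // Pull at most 10 messages off the queue at a time.
        host.Description.Behaviors.Add(new ServiceThrottlingBehavior
        {
            MaxConcurrentCalls = 10
        });

        host.Open();
        Console.ReadLine();
        host.Close();
    }
}

[ServiceContract]
public interface IReportService
{
    [OperationContract(IsOneWay = true)]
    void CreateReport(string reportId);
}

public class ReportService : IReportService
{
    public void CreateReport(string reportId)
    {
        // ... generate the PDF; if this process dies, unread messages stay queued.
    }
}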

How can I send the HTTP response back to the user but still do more things on the server after that?

Sometimes there is a lot that needs to be done when a given action is called. Often, more needs to be done than just generating the next HTML for the user. In order to give the user a faster experience, I want to do only what I need to do to get them their next view and send it off, but still do more things afterwards. How can I do this, multi-threading? Would I then need to worry about making sure different threads don't step on each other's feet? Is there any built-in functionality for this type of thing in ASP.NET MVC?
As others have mentioned, you can use a spawned thread to do this. I would take care to consider the 'criticality' of several edge cases:
If your background task encounters an error and fails to do what the user expected to be done, do you have a mechanism to report this failure to the user?
Depending on how 'business critical' the various tasks are, using a robust/resilient message queue to store 'background tasks to be processed' will help protect against a scenario where the user requests some action, and the server responsible crashes, or is taken offline, or the IIS service is restarted, etc., and the background thread never completes.
Just food for thought on other issues you might need to address.
How can I do this, multi-threading?
Yes!
Would I then need to worry about making sure different threads don't step on each other's feet?
This is something you need to take care of anyway, since two different ASP.NET requests could arrive at the same time (from different clients) and be handled in two different worker threads simultaneously. So any code accessing shared data needs to be written in a thread-safe way anyway, even without your new feature.
Is there any built-in functionality for this type of thing in ASP.NET MVC?
The standard .NET multi-threading techniques should work just fine here (manually starting threads, using the Task features, or using the Async CTP, ...).
It depends on what you want to do, and how reliable you need it to be. If the operations pending after the response was sent are OK to be lost, then .NET async calls, the ThreadPool, or a new Thread are all going to work just fine. If the process crashes, the pending work is lost, but you have already accepted that this can happen.
If the work requires any reliability guarantee, for instance if it updates the site database, then you cannot rely on in-process .NET threading; you need to persist the request to do the work and then process that work even after a process restart (an app-pool recycle, as IIS so kindly calls it).
One way to do this is to use MSMQ. Another way is to use a database table as a queue. The most reliable way is to use the database activation mechanisms, as described in Asynchronous procedure execution.
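A rough sketch of the database-table-as-a-queue option, where the request handler only records the work and returns; the table, column and connection-string names are illustrative, and a separate process (service or scheduled task) is assumed to poll the table:
using System;
using System.Configuration;
using System.Data.SqlClient;

public static class WorkQueue
{
    public static void Enqueue(string workType, string payload)
    {
        string connectionString =
            ConfigurationManager.ConnectionStrings["Site"].ConnectionString;

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "INSERT INTO dbo.PendingWork (WorkType, Payload, CreatedUtc) " +
            "VALUES (@type, @payload, @created)", connection))
        {
            command.Parameters.AddWithValue("@type", workType);
            command.Parameters.AddWithValue("@payload", payload);
            command.Parameters.AddWithValue("@created", DateTime.UtcNow);

            connection.Open();
            command.ExecuteNonQuery();
        }
        // The response can be returned now; the queued row survives app-pool recycles.
    }
}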
You can start a background task, then return from the action. This example uses the Task Parallel Library, found in .NET 4.0:
public ActionResult DoSomething()
{
    Task t = new Task(() => DoSomethingAsynchronously());
    t.Start();
    return View();
}
I would use MSMQ for this kind of work. Rather than spawning threads in an ASP.NET application, I'd use an asynchronous, out-of-process way to do this. It's very simple and very clean.
In fact I've been using MSMQ in ASP.NET applications for a very long time and have never had any issues with this approach. Further, having a different process (that is, an executable in a different app domain) do the long-running work is an ideal way to handle it, since your web application is not being used to do this work. So IIS, the thread pool and your web application can continue to do what they need to, while other processes handle the long-running tasks.
Maybe you should give it a try: Using an Asynchronous Controller in ASP.NET MVC
