How to kill all child threads in C#?

Suppose, I have a server application. This application polls for incoming clients in a separate child thread. Also, sub-threads are spawned for each of the incoming clients to service their requests separately.
I want to kill all threads except the Server application's main thread, say, by clicking a button.
How can I do that?

This application polls for incoming clients in a separate child thread.
Okay, so it seems you are the one creating those threads. Can't you just keep a list of those threads and then call Abort on them all? That seems the most straightforward approach (but be aware of some caveats).
If that is not possible, you could get a list of all threads, filter them on their name or some other characteristic, and Abort those.
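For the tracked-threads route, a rough sketch (the class and member names here are my own, not from your code) could look like this:

using System.Collections.Generic;
using System.Threading;

public class ClientThreadTracker
{
    private readonly List<Thread> _clientThreads = new List<Thread>();
    private readonly object _sync = new object();

    public void StartClientThread(ThreadStart clientWork)
    {
        var thread = new Thread(clientWork) { IsBackground = true };
        lock (_sync)
            _clientThreads.Add(thread);
        thread.Start();
    }

    // Wire this to the button click. Abort injects a ThreadAbortException
    // into each thread, so the usual caveats about inconsistent state apply.
    public void AbortAll()
    {
        lock (_sync)
        {
            foreach (var thread in _clientThreads)
            {
                if (thread.IsAlive)
                    thread.Abort();
            }
            _clientThreads.Clear();
        }
    }
}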
A better approach is to use tasks, which are far better manageable than threads. You might want to read up on TPL.
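If you go the TPL route, cooperative cancellation is the usual shape; a minimal sketch, assuming each client handler is written to observe a shared token:

using System;
using System.Threading;
using System.Threading.Tasks;

public class ClientTaskHost
{
    private readonly CancellationTokenSource _cts = new CancellationTokenSource();

    public Task StartClient(Action<CancellationToken> serveClient)
    {
        return Task.Factory.StartNew(() => serveClient(_cts.Token), _cts.Token);
    }

    // Wire this to the button click: each task sees the token and exits cleanly.
    public void StopAll()
    {
        _cts.Cancel();
    }
}

// Inside serveClient you would poll the token, e.g.:
//   while (!token.IsCancellationRequested) { /* handle one request */ }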

Related

Using ThreadPool.QueueUserWorkItem in ASP.NET

In ASP.NET, I am using ThreadPool.QueueUserWorkItem to create a huge PDF report. My requirement is that the report has to be created asynchronously, and I do not want to wait for the response. I plan to achieve it with the code below:
protected void Button1_Click(object sender, EventArgs e)
{
    ThreadPool.QueueUserWorkItem(report => CreateReport());
}

public void CreateReport()
{
    // This method will take 30 seconds to finish its work.
}
My question is: will ThreadPool.QueueUserWorkItem create a new thread from the ASP.NET worker process or some system thread? Is this a good approach? I may have hundreds of concurrent users accessing the web page.
The QueueUserWorkItem() method utilizes the process's ThreadPool, which automatically manages a number of worker threads. These threads are assigned a task, run it to completion, and are then returned to the ThreadPool for reuse.
Since this is hosted in ASP.NET the ThreadPool will belong to the ASP.NET process.
The ThreadPool is a very good candidate for this type of work; as the alternative of spinning up a dedicated thread is relatively expensive. However, you should consider the following limitations of the ThreadPool as well:
The ThreadPool is used by other aspects of .NET, and provides a limited number of threads. If you overuse it there is the possibility your tasks will be blocked waiting for others to complete. This is especially a concern in terms of scalability--however it shouldn't dissuade you from using the ThreadPool unless you have reason to believe it will be a bottleneck.
The ThreadPool tasks must be carefully managed to ensure they are returned for reuse. Unhandled exceptions or returns from a background thread will essentially "leak" that thread and prevent it from being reused. In these scenarios the ThreadPool may effectively lose its threads and cause a serious slowdown or halt of the process.
The tasks you assign to the ThreadPool should be short-lived. If your processing is intensive then it's a better idea to provide it with a dedicated thread.
All these topics relate to the simple concept that the ThreadPool is intended for small tasks, and that its threads provide a cost saving to the consuming code by being reused. Your scenario sounds like a reasonable case for using the ThreadPool; however, you will want to code carefully around it, and run realistic load tests to determine whether it is the best approach.
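In that spirit, a defensive sketch of the button handler (LogError is a hypothetical logging helper, not something from the question) so that an exception in CreateReport can never escape the pool thread:

protected void Button1_Click(object sender, EventArgs e)
{
    ThreadPool.QueueUserWorkItem(_ =>
    {
        try
        {
            CreateReport();
        }
        catch (Exception ex)
        {
            // Never let an exception escape a queued work item; log it here
            // so the pool thread is returned cleanly for reuse.
            LogError(ex);
        }
    });
}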
The thread pool will manage the number of active threads as needed. Once a thread is done with a task it continues on the next queued task. Using the thread pool is normally a good way to handle background processing.
When running in an ASP.NET application there are a couple of things to be aware of:
ASP.NET applications can be recycled for various reasons. When this happens all queued work items are lost.
There is no simple way to signal back to the client web browser that the operation completed.
A better approach in your case might be to have a WCF service with a REST/JSON binding that is called by AJAX code on the client web page for doing the heavy work. This would give you the possibility to report progress and results back to the user.
In addition to what Anders Abel has already laid out, which I agree with entirely, you should consider that ASP.NET also uses the thread pool to respond to requests, so long-running work like this on a pool thread is effectively taking resources away from the threads ASP.NET could use to fulfil other requests.
If you were to ask me how best to architect it, I would dispatch the work to a WCF service using one-way messaging over the MSMQ transport. That way it is fast to dispatch and resilient to failure, and processing of the requests on the WCF side can be more tightly controlled because the messages will just sit on the queue waiting to be processed. So if your server can only create 10 PDFs at a time, you would set maxConcurrentCalls for the WCF service to 10 and it will pull a maximum of 10 messages off the queue at once. Also, if your service shuts down, it will simply resume processing the queue when it starts up again.
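For illustration only, a rough sketch of that shape, assuming a hypothetical IReportService contract and MSMQ queue address; the real binding configuration, queue setup and report generation are yours to define:

using System;
using System.ServiceModel;
using System.ServiceModel.Description;

[ServiceContract]
public interface IReportService
{
    [OperationContract(IsOneWay = true)]        // fire-and-forget: the caller returns immediately
    void CreateReport(string reportId);
}

public class ReportService : IReportService
{
    public void CreateReport(string reportId)
    {
        // Long-running PDF generation would go here.
    }
}

public static class ReportHostProgram
{
    public static void Main()
    {
        var host = new ServiceHost(typeof(ReportService));

        // Hypothetical queue address; the queue must already exist.
        host.AddServiceEndpoint(typeof(IReportService),
                                new NetMsmqBinding(),
                                "net.msmq://localhost/private/reportRequests");

        // Cap concurrent processing so at most 10 reports are generated at once.
        host.Description.Behaviors.Add(new ServiceThrottlingBehavior
        {
            MaxConcurrentCalls = 10
        });

        host.Open();
        Console.ReadLine();
        host.Close();
    }
}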

How can I use threads to run database queries in XNA?

I'm currently developing a project with XNA that is pulling information (ID, name, file location, etc) about each of my objects (each object will be displayed on screen) from a local SQL database.
I'd like to run my database queries on a separate thread so the rendered screen doesn't freeze if the database hangs or some other unforeseen event occurs. I'm using XNA 4.0 and the application will only be running on Windows. Is this possible, and if so, how?
There are a number of options available. Generally speaking you need the query to run in a separate thread. You can use
Thread pool
QueueUserWorkItem
Tasks
Background worker
Async calls to the database
Parallel invoke
Manually created threads
I would start with thread pooling and see how that works; dedicated manual threads are not that robust in terms of memory management and reuse.
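As a concrete illustration of the thread-pool option, here is a minimal sketch; LoadObjectsFromDb, GameObjectData and ApplyResults are hypothetical placeholders for your own data-access and game code. The query runs on a pool thread and the results are handed back to the game loop through a queue drained once per frame:

using System;
using System.Collections.Generic;
using System.Threading;

public class ObjectLoader
{
    private readonly Queue<Action> _mainThreadActions = new Queue<Action>();

    public void BeginLoad()
    {
        ThreadPool.QueueUserWorkItem(_ =>
        {
            // Runs off the render thread, so a slow database won't freeze drawing.
            List<GameObjectData> results = LoadObjectsFromDb();

            lock (_mainThreadActions)
            {
                // Defer applying the results until the game loop drains the queue.
                _mainThreadActions.Enqueue(() => ApplyResults(results));
            }
        });
    }

    // Call this once per frame from Game.Update().
    public void PumpMainThreadActions()
    {
        Action[] pending;
        lock (_mainThreadActions)
        {
            pending = _mainThreadActions.ToArray();
            _mainThreadActions.Clear();
        }
        foreach (var action in pending)
            action();
    }

    private List<GameObjectData> LoadObjectsFromDb()
    {
        // Your SQL query goes here.
        return new List<GameObjectData>();
    }

    private void ApplyResults(List<GameObjectData> results)
    {
        // Update the objects that will be rendered.
    }
}

public class GameObjectData { /* ID, name, file location, ... */ }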
Don't do it at all. Seriously. There are good reasons for using threads, but your reasons are bogus:
the rendered screen doesn't freeze if the database hangs or some other unforeseen event occur
Databases don't hang, and unforeseen events are unforeseen events. How can you cope with the database not answering for 3 minutes, for example? Show a screen with objects that are unknown?
How do you mean "best"? There are a lot of ways to use threads and they all have strengths and weaknesses.
Declaring a new thread explicitly and starting it gives you the most direct control over the execution state of that thread:
var myDbThread = new Thread(() => myDbRepo.GetRecordById<MyEntity>(idString));
myDbThread.Start();
Now, as long as you have a reference to myDbThread, you can abort it, pause it, join on it, etc. BUT, with control comes responsibility; you have to manage the threads you create yourself.
For most parallel tasks, using the ThreadPool is recommended. However, you lose some of the control:
Action myDbLambda = () => myEntityProperty = myDbRepo.GetRecordById<MyEntity>(idString);
var asyncResult = myDbLambda.BeginInvoke(null, null);
Once asyncResult.IsCompleted returns true, myEntityProperty has the value. You can also architect it as a Func and use a callback to set the value (this is recommended). The asynchronous programming model is built into the BeginInvoke()/EndInvoke() method pair, and many exceptions like timeouts are expected by the ThreadPool, which will simply restart the timed-out thread. However, you can't "give up" and terminate a ThreadPool thread, "joining" on a ThreadPool thread is a little trickier, and if you're launching a lot of threads, the ThreadPool will start them at 250ms intervals, which may not be the best use of the processor.
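For the Func-with-callback variant just mentioned, a short sketch in the same style as the snippets above (myDbRepo, idString and myEntityProperty are assumed from the earlier example):

// The callback runs on a pool thread when the query finishes.
Func<MyEntity> query = () => myDbRepo.GetRecordById<MyEntity>(idString);

query.BeginInvoke(ar =>
{
    // EndInvoke retrieves the result and rethrows any exception from the call.
    MyEntity entity = query.EndInvoke(ar);
    myEntityProperty = entity;   // marshal back to the main thread here if needed
}, null);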
There are many ways to use the ThreadPool; before delegates became even more important to .NET programming in v3.5, ThreadPool.QueueUserWorkItem was the main method. Now, as I said, delegates have BeginInvoke and EndInvoke methods allowing you to kick off background processes with the asynchronous model built in behind the scenes. In WinForms/WPF, you can also create BackgroundWorker components which are event-driven, allowing you to monitor progress and completion in a GUI element.
One thing to be aware of; it is virtually never a good idea to use background threads in ASP.NET. Unless you really know what you're doing, best-case you won't get the results of the behavior you sent to the worker thread, and worst-case you can crash your site trying.

How can I run a piece of code in the main thread from a different thread?

I am using the class HttpListener as a web server. This server runs on a different thread.
At some point this server needs to run some code but it needs to be executed on the main thread. Is there an easy way of doing that?
Thanks!
The bigger question is:
Why do you need to run it on the parent thread? Is it UI code modifying the UI? Do you need to be within that thread's context to guarantee thread safety?
It might be worth stepping back and re-evaluating your threading model, you may be trying to do things in the wrong place.
I suggest you read this excellent free e-book on C# threading and learn about the alternative ways of inter-thread communication, and look into the Dispatcher if you're using WPF, as it will help delegate events back to the UI thread if that's your intent.
Quick & dirty solution (not really the best way):
There's any number of ways to approach this; the simplest would probably be to have a list of delegates to execute on the main thread. Each time your main thread spins, you lock the collection (unless you're using the thread-safe collections), copy out the delegates, clear the collection, and release the lock.
Then you simply run them on the main thread.
The problem you'll run into is if you're blocking on the main thread: your spin cycle will not get around to your delegates until the blocking stops. So if you are, say, blocking while you wait for connections, your code will not run until a new client connects.
You could put the server's listen port on its own thread to solve this.
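A rough sketch of that delegate-queue idea (class and member names are mine), using a wait handle so the main loop can block with a timeout instead of spinning:

using System;
using System.Collections.Generic;
using System.Threading;

public class MainThreadDispatcher
{
    private readonly Queue<Action> _work = new Queue<Action>();
    private readonly AutoResetEvent _workAvailable = new AutoResetEvent(false);

    // Called from the HttpListener (or any other) thread.
    public void Post(Action action)
    {
        lock (_work)
            _work.Enqueue(action);
        _workAvailable.Set();
    }

    // Called repeatedly from the main thread's loop.
    public void RunPending(TimeSpan timeout)
    {
        if (!_workAvailable.WaitOne(timeout))
            return;                                // nothing arrived within the timeout

        Action[] pending;
        lock (_work)
        {
            pending = _work.ToArray();
            _work.Clear();
        }
        foreach (var action in pending)
            action();                              // executes on the main thread
    }
}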
To do something on the main thread, you will probably want to inject it via Invoke(), or the main loop will have some queue of things to do that gets filled from the 'other' threads, in this case the HttpListener.
Your example seems similar to mine, where I have 300 threads handling stream ripping, and they all 'call' the main thread by putting string messages into a queue for it. It works like a charm. However, when I did try (I dared, just to see what would happen) to Invoke() from at least 30-ish threads to the main message loop, it was weird, to say the least.
Best: use a simple Queue<T>; enqueue from the other thread, then dequeue from the UI thread.

Threadpool, order of execution and long running operations

I have a need to create multiple processing threads in a new application. Each thread has the possibility of being "long running". Can someone comment on the viability of the built in .net threadpool or some existing custom threadpool for use in my application?
Requirements :
Works well within a windows service. (queued work can be removed from the queue, currently running threads can be told to halt)
Ability to spin up multiple threads.
Work needs to be started in sequential order, but multiple threads can be processing in parallel.
Hung threads can be detected and killed.
EDIT:
Comments seem to be leading towards manual threading. Unfortunately I am held to version 3.5 of the framework. The ThreadPool was appealing because it would allow me to queue work up and have threads created for me when resources were available. Is there a good 3.5-compatible pattern (producer/consumer perhaps) that would give me this aspect of the ThreadPool without actually using the ThreadPool?
Your requirements essentially rule out the use of the .NET ThreadPool;
It generally should not be used for long-running threads, due to the danger of exhausting the pool.
It does work well in Windows services, though, and you can spin up multiple threads - limited automatically by the pool's limits.
You cannot guarantee thread starting times with the thread pool; it may queue work items for execution when it has enough free threads, and it does not even guarantee they will be started in the sequence you submit them.
There are no easy ways to detect and kill running threads in the ThreadPool.
So essentially, you will want to look outside the ThreadPool; I might recommend that perhaps you might need 'full' System.Threading.Thread instances just due to all of your requirements. As long as you handle concurrency issues (as you must with any threading mechanism), I don't find the Thread class to be all that difficult to manage myself, really.
Simple answer, but the Task class (Fx4) meets most of your requirements.
Cancellation is cooperative, ie your Task code has to check for it.
But detecting hung threads is difficult; that is a very demanding requirement anyway.
But I can also read your requirements as being for a job queue, where the 'work' consists of mostly similar jobs. You could roll your own system that consumes that queue and monitors execution on a few threads.
I've done essentially the same thing with .NET 3.5 by creating my own thread manager (a rough sketch follows these steps):
Instantiate worker classes that know how long they've been running.
Create threads that run a worker method and add them to a Queue<Thread>.
A supervisor thread reads threads from the Queue and adds them to a Dictionary<int, Worker> as it launches them until it hits its maximum running threads. Add the thread as a property of the Worker instance.
As each worker finishes it invokes a callback method from the supervisor that passes back its ManagedThreadId.
The supervisor removes the thread from the Dictionary and launches another waiting thread.
Poll the Dictionary of running workers to see if any have timed out, or put timers in the workers that invoke a callback if they take too long.
Signal a long-running worker to quit, or abort its thread.
The supervisor invokes callbacks to your main thread to inform of progress, etc.
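A stripped-down, .NET 3.5-compatible sketch of that kind of manager (the names are mine; hung-thread detection and abort are omitted here and would live in a supervisor that tracks start times):

using System;
using System.Collections.Generic;
using System.Threading;

public class WorkQueue
{
    private readonly Queue<Action> _queue = new Queue<Action>();
    private readonly List<Thread> _workers = new List<Thread>();
    private volatile bool _stopping;

    public WorkQueue(int workerCount)
    {
        for (int i = 0; i < workerCount; i++)
        {
            var worker = new Thread(WorkerLoop) { IsBackground = true };
            _workers.Add(worker);
            worker.Start();
        }
    }

    public void Enqueue(Action job)
    {
        lock (_queue)
        {
            _queue.Enqueue(job);        // jobs are picked up in submission order
            Monitor.Pulse(_queue);      // wake one waiting worker
        }
    }

    public void Stop()
    {
        _stopping = true;
        lock (_queue)
            Monitor.PulseAll(_queue);   // wake everyone so they can observe _stopping
        foreach (var worker in _workers)
            worker.Join();
    }

    private void WorkerLoop()
    {
        while (true)
        {
            Action job;
            lock (_queue)
            {
                while (_queue.Count == 0 && !_stopping)
                    Monitor.Wait(_queue);
                if (_stopping && _queue.Count == 0)
                    return;
                job = _queue.Dequeue();
            }

            try { job(); }
            catch (Exception ex) { Console.Error.WriteLine(ex); }   // one bad job must not kill the worker
        }
    }
}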

Server multithreading overkill?

I'm creating a server-type application at the moment which will do the usual listening for connections from external clients and, when they connect, handle requests, etc.
At the moment, my implementation creates a pair of threads every time a client connects. One thread simply reads requests from the socket and adds them to a queue, and the second reads the requests from the queue and processes them.
I'm basically looking for opinions on whether or not you think having all of these threads is overkill, and importantly whether this approach is going to cause me problems.
It is important to note that most of the time these threads will be idle - I use wait handles (ManualResetEvent) in both threads. The Reader thread waits until a message is available and if so, reads it and dumps it in a queue for the Process thread. The Process thread waits until the reader signals that a message is in the queue (again, using a wait handle). Unless a particular client is really hammering the server, these threads will be sat waiting. Is this costly?
I've done a bit of testing - had 1,000 clients connected, continually nagging the server (so, 2,000+ threads) - and it seemed to cope quite well.
I think your implementation is flawed. This kind of design doesn't scale because creating threads is expensive and there is a limit on how many threads can be created.
That is the reason that most implementations of this type use a thread pool. That makes it easy to put a cap on the maximum amount of threads while easily managing new connections and reusing the threads when the work is finished.
If all you are doing with your thread is putting items in a queue, then use the ThreadPool.QueueUserWorkItem method to use the default .NET thread pool.
You haven't given enough information in your question to say for definite, but perhaps you now only need one other thread, constantly running to clear down the queue; you can use a wait handle to signal when something has been added.
Just make sure to synchronise access to your queue or things will go horribly wrong.
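A small sketch of that layout (Request is a placeholder for your message type): readers enqueue into a synchronised queue, and a single processing thread is woken by a wait handle:

using System;
using System.Collections.Generic;
using System.Threading;

public class RequestProcessor
{
    private readonly Queue<Request> _requests = new Queue<Request>();
    private readonly AutoResetEvent _signal = new AutoResetEvent(false);

    public RequestProcessor()
    {
        new Thread(ProcessLoop) { IsBackground = true }.Start();
    }

    // Reader threads (or ThreadPool work items) call this.
    public void Enqueue(Request request)
    {
        lock (_requests)
            _requests.Enqueue(request);
        _signal.Set();
    }

    private void ProcessLoop()
    {
        while (true)
        {
            _signal.WaitOne();                 // idle until a reader signals

            while (true)
            {
                Request request;
                lock (_requests)
                {
                    if (_requests.Count == 0)
                        break;
                    request = _requests.Dequeue();
                }
                Handle(request);               // actual request processing
            }
        }
    }

    private void Handle(Request request) { /* process one request */ }
}

public class Request { /* message payload */ }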
I advise using the following pattern. First you need a thread pool - built in or custom. Have a thread that checks whether there is something available to read; if so, it picks a reader thread. The reading thread then puts the request into a queue, and a thread from the pool of processing threads will pick it up. This will minimize the number of threads and minimize the time spent in a waiting state.
