How do I make the foreground thread wait for all background (child) threads to finish in C#? I need to get a list of pending jobs from the queue (database), start a new thread to execute each of them, and finally wait for all the child threads to finish. How do I do that in C#? Thanks in advance.
You could store each launched thread in a list. Then, when you need to wait for them all, call the Join method on each thread in the list in a loop.
List<Thread> Threads = new List<Thread>();
...
Thread child = new Thread(...);
Threads.Add(child);
child.Start();
...
foreach (Thread t in Threads)
{
    t.Join();
}
HTH
Consider using ThreadPool. Most of what you want is already done. There is an example from Microsoft which does pretty much your entire task. Replace "fibonacci" with "database task" and it sounds like your problem.
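If you go the ThreadPool route, the usual trick is to pair the queued work with a counter and a single wait handle. Here is a minimal sketch of that idea; the job list, the Thread.Sleep standing in for the database work, and the ProcessJob name are all made up for illustration:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

class JobRunner
{
    static int _pending;
    static readonly ManualResetEvent _allDone = new ManualResetEvent(false);

    static void Main()
    {
        // Stand-in for the pending jobs read from the database queue.
        List<int> jobs = new List<int> { 1, 2, 3, 4, 5 };
        _pending = jobs.Count;
        foreach (int job in jobs)
        {
            ThreadPool.QueueUserWorkItem(ProcessJob, job);
        }
        _allDone.WaitOne(); // foreground thread blocks here until every job signals
        Console.WriteLine("All jobs finished");
    }

    static void ProcessJob(object state)
    {
        int job = (int)state;
        Thread.Sleep(100); // simulate the actual work for this job
        // The last job to finish releases the waiting foreground thread.
        if (Interlocked.Decrement(ref _pending) == 0)
            _allDone.Set();
    }
}
```

Interlocked.Decrement makes the counter update safe across workers, and a single ManualResetEvent avoids the 64-handle limit of WaitHandle.WaitAll.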
Using dynamic data you can pass your object and the WaitHandle (AutoResetEvent) that lets you wait for all the background threads to finish without declaring an extra class:
static void Main(string[] args)
{
    List<AutoResetEvent> areList = new List<AutoResetEvent>();
    foreach (MyObject o in ListOfMyObjects)
    {
        AutoResetEvent are = new AutoResetEvent(false);
        areList.Add(are);
        ThreadPool.QueueUserWorkItem(DoWork, new { o, are });
    }
    Console.WriteLine("Time: {0}", DateTime.Now);
    WaitHandle.WaitAll(areList.ToArray());
    Console.WriteLine("Time: {0}", DateTime.Now);
    Console.ReadKey();
}

static void DoWork(object state)
{
    dynamic o = state;
    MyObject myObject = (MyObject)o.o;
    AutoResetEvent are = (AutoResetEvent)o.are;
    myObject.Execute();
    are.Set();
}
This is incomplete code, but ManualResetEvent can work for you; each queued work item signals its own event when it finishes:

var waitEvents = new List<ManualResetEvent>();
foreach (var action in actions)
{
    var evt = new ManualResetEvent(false);
    waitEvents.Add(evt);
    ThreadPool.QueueUserWorkItem(a =>
    {
        ((Action)a)();
        evt.Set(); // signal that this item is done
    }, action);
}
if (waitEvents.Count > 0)
    WaitHandle.WaitAll(waitEvents.ToArray());
Create a structure to keep track of your worker threads
private struct WorkerThreadElement
{
    public IAsyncResult WorkerThreadResult;
    public AsyncActionExecution WorkerThread;
}
You also need to keep track of the total number of threads expected to be created and the number of threads that have completed so far.
private int _TotalThreads = 0;
private int _ThreadsHandled = 0;
private List<WorkerThreadElement> _WorkerThreadElements = new List<WorkerThreadElement>();
Then create an AutoResetEvent handle in order to wait for thread completion.
// The wait handle thread construct to signal the completion of this process
private EventWaitHandle _CompletedHandle = new AutoResetEvent(false);
You also need a delegate to create new threads. There are multiple ways of doing this, but I have chosen a simple delegate for the sake of this example.
// Delegate to asynchronously invoke an action
private delegate void AsyncActionExecution();
Let's assume that the Invoke method is the entry point that will create all threads and wait for their execution. So we have:
public void Invoke()
{
    _TotalThreads = N; /* Change with the total number of threads expected */
    foreach (Object o in objects)
    {
        this.InvokeOneThread();
    }
    // Wait until execution has been completed
    _CompletedHandle.WaitOne();
    // Collect any exceptions thrown and bubble them up
    foreach (WorkerThreadElement workerThreadElement in _WorkerThreadElements)
    {
        workerThreadElement.WorkerThread.EndInvoke(workerThreadElement.WorkerThreadResult);
    }
}
InvokeOneThread is the method used to create a single thread for one operation. Here we need to create a worker thread element and invoke the actual thread.
private void InvokeOneThread()
{
    WorkerThreadElement threadElement = new WorkerThreadElement();
    // Wrap the method that does the actual work in the delegate
    threadElement.WorkerThread = new AsyncActionExecution(ActionToExecute);
    threadElement.WorkerThreadResult = threadElement.WorkerThread.BeginInvoke(InvokationCompleted, null);
    _WorkerThreadElements.Add(threadElement);
}
Callback from thread completion
private object _ThreadLocker = new object();
/// <summary>
/// Increment the number of threads that have fully completed
/// </summary>
/// <param name="ar"></param>
private void InvokationCompleted(IAsyncResult ar)
{
    lock (_ThreadLocker)
    {
        _ThreadsHandled++;
        if (_TotalThreads == _ThreadsHandled)
            _CompletedHandle.Set();
    }
}
Done
Do those two code blocks have the same effect when looking at the console?
Please note: Currently I am still using and bound to .NET 3.5.
First:
for (int i = 0; i < 3; i++)
{
    Console.WriteLine(i);
}
Second:
class Worker
{
    static int i = 0;
    public static ManualResetEvent manualResetEvent = new ManualResetEvent(false);
    static Object locky = new Object();

    public static void Work(Object workItem)
    {
        WaitHandle[] wait = new[] { manualResetEvent };
        while (WaitHandle.WaitAny(wait) == 0)
        {
            lock (locky)
            {
                Console.WriteLine(i++);
            }
        }
    }
}
// main:
Thread thread = new Thread(Worker.Work);
thread.Start();
for (int i = 0; i < 3; i++)
{
    Worker.manualResetEvent.Set();
}
Will the waitHandle increase with every signal? Will the loop run until all signals are done?
Or will a signal be ignored when the thread is already working?
Can someone please bring some light into this?
Since you're using a ManualResetEvent, once you signal the event, it remains signaled until it's reset. Which means setting it once or three times will have the same effect.
This also means that the worker will go into an infinite loop because the event is never reset.
Also, you can't lock on value types. If you could, the int would be boxed and create a new object every time you lock on it - which means you'd be locking on a different object every single time, rendering the lock useless.
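If the intent was for each Set to produce exactly one loop iteration, note that even an AutoResetEvent can lose signals when no thread is waiting at the moment Set is called. A counting Semaphore preserves every Release, so one possible rework of the example looks like this (an alternative sketch, not the original code; Semaphore is available in .NET 3.5, which matters given the constraint above):

```csharp
using System;
using System.Threading;

class Worker
{
    static int i = 0;
    static readonly Semaphore signals = new Semaphore(0, int.MaxValue);

    static void Work()
    {
        for (int n = 0; n < 3; n++)
        {
            signals.WaitOne();      // consumes exactly one signal per iteration
            Console.WriteLine(i++); // prints 0, 1, 2
        }
    }

    static void Main()
    {
        Thread thread = new Thread(Work);
        thread.Start();
        for (int n = 0; n < 3; n++)
        {
            signals.Release();      // each Release adds one permit; none are lost
        }
        thread.Join();
    }
}
```

Unlike the event-based version, the worker here exits cleanly after consuming its three permits instead of looping forever.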
I have created a class which implements a thread pool. The code is below:
public sealed class PyeThreadPool :
    IDisposable
{
    private readonly object _lock = new object();
    private readonly int _minThreadCount;
    private readonly int _maxThreadCount;
    private readonly Queue<Action> _queue = new Queue<Action>();
    private int _totalThreadCount;
    private int _waitingThreadCount;
    private bool _disposed;

    public PyeThreadPool(int minThreadCount, int maxThreadCount)
    {
        if (minThreadCount < 0)
            throw new ArgumentOutOfRangeException("minThreadCount");
        if (maxThreadCount < 1 || maxThreadCount < minThreadCount)
            throw new ArgumentOutOfRangeException("maxThreadCount");
        _minThreadCount = minThreadCount;
        _maxThreadCount = maxThreadCount;
    }

    public void Dispose()
    {
        lock (_lock)
        {
            _disposed = true;
            // If there are threads waiting, they should stop waiting.
            if (_waitingThreadCount > 0)
                Monitor.PulseAll(_lock);
        }
    }

    /// <summary>
    /// Executes an action in a parallel thread.
    /// </summary>
    public void RunParallel(Action action)
    {
        if (action == null)
            throw new ArgumentNullException("action");
        lock (_lock)
        {
            if (_disposed)
                throw new ObjectDisposedException(GetType().FullName);
            bool queued = false;
            if (_waitingThreadCount == 0)
            {
                if (_totalThreadCount < _maxThreadCount)
                {
                    _totalThreadCount++;
                    var thread = new Thread(_ThreadRun);
                    thread.Name = "Worker Thread";
                    thread.Start(action);
                    queued = true;
                }
            }
            if (!queued)
            {
                _queue.Enqueue(action);
                Monitor.Pulse(_lock);
            }
        }
    }

    private void _ThreadRun(object firstAction)
    {
        Action action = (Action)firstAction;
        firstAction = null;
        // We always start a new thread with an action, so we get it immediately.
        // But, as we don't know what that action really holds in memory, we set
        // the initial action to null, so after it finishes and a new action is
        // fetched, we let the GC collect it.
        while (true)
        {
            action();
            lock (_lock)
            {
                if (_queue.Count == 0)
                {
                    // We started waiting, so new threads don't need to be created.
                    _waitingThreadCount++;
                    while (_queue.Count == 0)
                    {
                        if (_disposed)
                            return;
                        if (_totalThreadCount > _minThreadCount)
                        {
                            _totalThreadCount--;
                            _waitingThreadCount--;
                            return;
                        }
                        action = null;
                        Monitor.Wait(_lock);
                    }
                    // We finished waiting.
                    _waitingThreadCount--;
                }
                action = _queue.Dequeue();
                // We just got a new action; we release the lock and return to the
                // top of the while loop, where the action will be executed.
            }
        }
    }
}
I have tried to use this and the test code is as:
PyeThreadPool MyPool;
int x = 1;

protected void Page_Load(object sender, EventArgs e)
{
    MyPool = new PyeThreadPool(4, 6);
}

void showMessage(string message)
{
    TxtMessage.Text = message;
}

protected void BtnStartThread_Click(object sender, EventArgs e)
{
    x++;
    int arg = x;
    MyPool.RunParallel(() =>
    {
        showMessage(arg.ToString());
    });
}
The problems are:
(1) When I execute this in either debug or release mode, I do not see the result in the textbox, but I do see it when I step through. What am I missing here; why can't I see the output?
(2) The RunParallel method shows only one thread even though I have set maxThreadCount to more than 1. Is there any code logic missing, or is it because the test application is simple?
Thanks!
You should have a look at the SmartThreadPool library. It is one of the best alternatives to ThreadPool.
Its features (copied from the source link):
Smart Thread Pool is a thread pool written in C#. The implementation was first based on Stephen Toub's thread pool with some extra features, but now it is far beyond the original. Here is a list of the thread pool features:
The number of threads dynamically changes according to the workload on the threads in the pool.
Work items can return a value.
A work item can be cancelled if it hasn't been executed yet.
The caller thread's context is used when the work item is executed (limited).
Usage of minimum number of Win32 event handles, so the handle count of the application won't explode.
The caller can wait for multiple or all the work items to complete.
A work item can have a PostExecute callback, which is called as soon the work item is completed.
The state object that accompanies the work item can be disposed automatically.
Work item exceptions are sent back to the caller.
Work items have priority.
Work items group.
The caller can suspend the start of a thread pool and work items group.
Threads have priority.
Threads have initialization and termination events.
WinCE platform is supported (limited).
Action and Func generic methods are supported.
Silverlight is supported.
Mono is supported.
Performance counters (Windows and internal).
Work item timeout (passive).
Threads ApartmentState
Threads IsBackground
Threads name template
Windows Phone is supported (limited)
Threads MaxStackSize
The problem is you are attempting to update a UI control from a background thread. Not allowed.
You need to do a BeginInvoke or Invoke in your ShowMessage function.
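For a desktop (WinForms) UI, the marshaling this answer describes looks roughly like the sketch below. The DemoForm wrapper is invented for illustration; the point is the InvokeRequired/Invoke pattern inside showMessage:

```csharp
using System;
using System.Threading;
using System.Windows.Forms;

public class DemoForm : Form
{
    private readonly TextBox TxtMessage = new TextBox();

    public DemoForm()
    {
        Controls.Add(TxtMessage);
        // Start the background work once the window handle exists.
        Load += delegate
        {
            new Thread(() => showMessage("done")) { IsBackground = true }.Start();
        };
    }

    void showMessage(string message)
    {
        if (TxtMessage.InvokeRequired)
        {
            // Called from a background thread: marshal back onto the UI thread.
            TxtMessage.Invoke(new Action<string>(showMessage), message);
        }
        else
        {
            TxtMessage.Text = message; // now safe: we are on the UI thread
        }
    }

    [STAThread]
    static void Main()
    {
        Application.Run(new DemoForm());
    }
}
```

The guard on InvokeRequired lets the same method be called safely from either the UI thread or a worker thread.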
I'm new to event handling and threading in C#, so forgive me if this question is basic: How do I create several classes to run on different threads, all listening for the same event?
For example, I receive a new piece of data frequently but at random intervals. When this data arrives I update a class with the new data, let's call it MyDataClass, which raises an event: MyDataClass.NewEvent.
Then I have a class called NewEventHandler. When an event fires, this class does some calculation with the new data, then sends the result to another application.
So the problem is this:
I need to have around 30 instances of NewEventHandler all listening to MyDataClass.NewEvent (each does a different calculation and produces a different result). It's important that these calculations all run simultaneously: as soon as the event fires, all 30 instances of NewEventHandler start calculating. However, it does not matter whether they finish together (i.e. no synchronization is needed here).
How do I actually create the instances of NewEventHandler so that they run on different threads, and get them to all listen to a single instance of MyDataClass.NewEvent?
In C#, the general practice is that the event listener's methods are called on the same thread which fires the event. The standard template to fire an event from the source class is:
void FireNewEvent()
{
    var tmp = NewEvent;
    if (tmp != null)
    {
        tmp(this, YourEventArgsInstance);
    }
}
An event is not much more than a glorified multicast delegate, so this call behaves like a multicast delegate call: all the subscribers are invoked sequentially on the same thread FireNewEvent runs on. I suggest you do not change this behavior.
If you want to run event subscribers simultaneously, start a new task in each subscriber.
...
MyDataClass.NewEvent += OneOfSubscriberClassInstance.OnNewEvent;
...

public void OnNewEvent(object sender, YourEventArgs args)
{
    Task.Factory.StartNew(() =>
    {
        // all your event handling code here
    });
}
The code which fires the event will still invoke the 30 subscribers sequentially, but each subscriber hands its work off to its own thread scheduled by the TPL. So the delegate which fires the event does not have to wait for the current subscriber's handler to finish before firing the next one.
Here is an example/demonstration of how you can synchronize the different threads and ensure they all respond to events at the same time. You can copy and paste this code into a console application to see it run.
public class Program
{
    private static EventWaitHandle _waitHandle;
    private const int ThreadCount = 20;
    private static int _signalledCount = 0;
    private static int _invokedCount = 0;
    private static int _eventCapturedCount = 0;
    private static CountdownEvent _startCounter;
    private static CountdownEvent _invokeCounter;
    private static CountdownEvent _eventCaptured;

    public static void Main(string[] args)
    {
        _waitHandle = new EventWaitHandle(false, EventResetMode.ManualReset);
        _startCounter = new CountdownEvent(ThreadCount);
        _invokeCounter = new CountdownEvent(ThreadCount);
        _eventCaptured = new CountdownEvent(ThreadCount);

        //Start multiple threads that block until signalled
        for (int i = 1; i <= ThreadCount; i++)
        {
            var t = new Thread(new ParameterizedThreadStart(ThreadProc));
            t.Start(i);
        }

        //Allow all threads to start
        Thread.Sleep(100);
        _startCounter.Wait();
        Console.WriteLine("Press ENTER to allow waiting threads to proceed.");
        Console.ReadLine();

        //Signal threads to start
        _waitHandle.Set();
        //Wait for all threads to acknowledge start
        _invokeCounter.Wait();
        //Signal threads to proceed
        _waitHandle.Reset();
        Console.WriteLine("All threads ready. Raising event.");
        var me = new object();
        //Raise the event
        MyEvent(me, new EventArgs());
        //Wait for all threads to capture event
        _eventCaptured.Wait();
        Console.WriteLine("{0} of {1} threads responded to event.", _eventCapturedCount, ThreadCount);
        Console.ReadLine();
    }

    public static EventHandler MyEvent;

    public static void ThreadProc(object index)
    {
        //Signal main thread that this thread has started
        _startCounter.Signal();
        Interlocked.Increment(ref _signalledCount);
        //Subscribe to event
        MyEvent += delegate(object sender, EventArgs args)
        {
            Console.WriteLine("Thread {0} responded to event.", index);
            _eventCaptured.Signal();
            Interlocked.Increment(ref _eventCapturedCount);
        };
        Console.WriteLine("Thread {0} blocks.", index);
        //Wait for main thread to signal ok to start
        _waitHandle.WaitOne();
        //Signal main thread that this thread has been invoked
        _invokeCounter.Signal();
        Interlocked.Increment(ref _invokedCount);
    }
}
I'm trying to implement multithreading in C#.
The basic idea is:
There will be many threads calling a function, and the threads don't finish in the order they are started.
I want to limit the maximum number of threads running.
When a thread finishes, I want to start another thread until everything is done.
To implement this, I use a list of threads. Every second, I check whether the list is already full and whether any thread has finished its job.
Here is my code:
List<Thread> threadCompany = new List<Thread>();
List<Thread> fireThisGuy = new List<Thread>();

while (i < _finish)
{
    if (threadCompany.Count < _MAX_THREAD)
    {
        int id = i; // copy the loop variable so the closure doesn't see later values
        Thread worker = new Thread(delegate()
        {
            CallFunction(id);
        });
        threadCompany.Add(worker);
        i++;
        worker.Start();
    }

    Thread.Sleep(1000); //Wait for a while instead of re-checking constantly

    foreach (Thread worker in threadCompany)
    {
        if (!worker.IsAlive)
        {
            fireThisGuy.Add(worker); //because threadCompany may not be
                                     //modified during iteration
        }
    }
    foreach (Thread worker in fireThisGuy)
    {
        threadCompany.Remove(worker);
    }
    fireThisGuy.Clear();
}
This works, but I don't think I'm being elegant and efficient here, how can I improve my code?
This is not the right way to solve the problem.
You don't need to keep a list of your threads; you just need to notify the application when all threads finish running.
This is a possible way to handle the problem, using SynchronizationContext to notify the main thread when execution completes, without any kind of wait cycle:
public class OwnerClass
{
    private SynchronizationContext syncContext;
    private int count;
    private int completedCount;

    // This event will be raised when all threads complete
    public event EventHandler Completed;

    public OwnerClass() :
        this(SynchronizationContext.Current)
    {
    }

    public OwnerClass(SynchronizationContext context)
    {
        if (context == null)
            throw new ArgumentNullException("context");
        this.syncContext = context;
    }

    // Call this method to start running
    public void Run(int threadsCount)
    {
        this.count = threadsCount;
        for (int i = 0; i < threadsCount; ++i)
        {
            ThreadPool.QueueUserWorkItem(this.ThreadFunc, null);
        }
    }

    private void ThreadFunc(object threadContext)
    {
        Thread.Sleep(1000); // my long and complicated function
        if (Interlocked.Increment(ref this.completedCount) >= this.count)
        {
            this.syncContext.Post(OnCompleted, null);
        }
    }

    protected virtual void OnCompleted(object state)
    {
        var handler = this.Completed;
        if (handler != null)
            handler(this, EventArgs.Empty);
    }
}
If you just want to run on a multiprocessor machine, assigning a thread to each processor, you can just use Parallel.For.
check out TPL and/or ThreadPool.
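On .NET 4, Parallel.ForEach with MaxDegreeOfParallelism covers both requirements at once: it caps the number of concurrent workers and blocks until every item has been processed. A sketch (the job count of 20, the cap of 4, and the CallFunction body are placeholders):

```csharp
using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

class Program
{
    static void CallFunction(int i)
    {
        Thread.Sleep(100); // simulate the real work for job i
        Console.WriteLine("Finished job {0}", i);
    }

    static void Main()
    {
        // Corresponds to _MAX_THREAD in the question's code.
        var options = new ParallelOptions { MaxDegreeOfParallelism = 4 };

        // Runs CallFunction for 0..19, never more than 4 at a time,
        // and returns only when every iteration has completed.
        Parallel.ForEach(Enumerable.Range(0, 20), options, CallFunction);
        Console.WriteLine("All done");
    }
}
```

Note that Parallel.ForEach blocks the calling thread until all iterations finish, so run it from a background thread rather than the UI thread.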
I have a timer calling a function every 15 minutes. This function counts the number of rows in my DGV and starts a thread for each row (running yet another function); each thread parses a web page, which can take anywhere from 1 to 10 seconds to finish.
While it works fine as it is with 1-6 rows, any more will cause the requests to time out.
I want to wait for each newly created thread to finish processing before getting back into the loop to create another thread, without locking the main UI.
for (int x = 0; x <= dataGridFollow.Rows.Count - 1; x++)
{
    string getID = dataGridFollow.Rows[x].Cells["ID"].Value.ToString();
    int ID = int.Parse(getID);
    Thread t = new Thread(new ParameterizedThreadStart(UpdateLo));
    t.Start(ID);
    // <- Wait for thread to finish here before getting back in the for loop
}
I have googled a lot in the past 24 hours and read a lot about this specific issue and its implementations (Thread.Join, thread pools, queuing, and even SmartThreadPool).
It's likely that I've read the correct answer somewhere, but I'm not at ease enough with C# to decipher those threading tools.
Thanks for your time.
To avoid freezing the UI, the framework provides a class expressly for these purposes: have a look at the BackgroundWorker class (it executes an operation on a separate thread). Here is some info:
http://msdn.microsoft.com/en-us/library/system.componentmodel.backgroundworker.aspx
http://msdn.microsoft.com/en-us/magazine/cc300429.aspx
By the way, if I understand correctly, you don't want to parallelize any operation, just wait for the method parsing the page to complete. Basically, for each row of your grid (a foreach loop) you get the ID and call the method. If you want to go parallel, just reuse the same foreach loop and make it parallel:
http://msdn.microsoft.com/en-us/library/dd460720.aspx
What you want is to set off a few workers that do some task.
When one finishes, you can start a new one.
I'm sure there is a better way using thread pools or whatever... but I was bored, so I came up with this.
using System;
using System.Collections.Generic;
using System.Linq;
using System.ComponentModel;
using System.Threading;

namespace WorkerTest
{
    class Program
    {
        static void Main(string[] args)
        {
            WorkerGroup workerGroup = new WorkerGroup();
            Console.WriteLine("Starting...");
            for (int i = 0; i < 100; i++)
            {
                var work = new Action(() =>
                {
                    Thread.Sleep(1000); //somework
                });
                workerGroup.AddWork(work);
            }
            while (workerGroup.WorkCount > 0)
            {
                Console.WriteLine(workerGroup.WorkCount);
                Thread.Sleep(1000);
            }
            Console.WriteLine("Fin");
            Console.ReadLine();
        }
    }

    public class WorkerGroup
    {
        private List<Worker> workers;
        private Queue<Action> workToDo;
        private object Lock = new object();

        public int WorkCount { get { return workToDo.Count; } }

        public WorkerGroup()
        {
            workers = new List<Worker>();
            workers.Add(new Worker());
            workers.Add(new Worker());
            foreach (var w in workers)
            {
                w.WorkCompleted += (OnWorkCompleted);
            }
            workToDo = new Queue<Action>();
        }

        private void OnWorkCompleted(object sender, EventArgs e)
        {
            FindWork();
        }

        public void AddWork(Action work)
        {
            workToDo.Enqueue(work);
            FindWork();
        }

        private void FindWork()
        {
            lock (Lock)
            {
                if (workToDo.Count > 0)
                {
                    var availableWorker = workers.FirstOrDefault(x => !x.IsBusy);
                    if (availableWorker != null)
                    {
                        var work = workToDo.Dequeue();
                        availableWorker.StartWork(work);
                    }
                }
            }
        }
    }

    public class Worker
    {
        private BackgroundWorker worker;
        private Action work;

        public bool IsBusy { get { return worker.IsBusy; } }

        public event EventHandler WorkCompleted;

        public Worker()
        {
            worker = new BackgroundWorker();
            worker.DoWork += new DoWorkEventHandler(OnWorkerDoWork);
            worker.RunWorkerCompleted += new RunWorkerCompletedEventHandler(OnWorkerRunWorkerCompleted);
        }

        private void OnWorkerRunWorkerCompleted(object sender, RunWorkerCompletedEventArgs e)
        {
            if (WorkCompleted != null)
            {
                WorkCompleted(this, EventArgs.Empty);
            }
        }

        public void StartWork(Action work)
        {
            if (!IsBusy)
            {
                this.work = work;
                worker.RunWorkerAsync();
            }
            else
            {
                throw new InvalidOperationException("Worker is busy");
            }
        }

        private void OnWorkerDoWork(object sender, DoWorkEventArgs e)
        {
            work.Invoke();
            work = null;
        }
    }
}
This would be just a starting point.
You could start it off with a list of Actions and then have a completed event for when that group of actions is finished.
Then at least you can use a ManualResetEvent to wait for the completed event... or whatever logic you want, really.
Call a method directly, or do a while loop (with sleep calls) to check the status of the thread.
There are also async callbacks, but they would call another method, and you want to continue from the same point.
I have no idea why the requests would timeout. That sounds like a different issue. However, I can make a few suggestions regarding your current approach.
Avoid creating threads in loops with nondeterministic bounds. There is a lot of overhead in creating threads. If the number of operations is not known beforehand then use the ThreadPool or the Task Parallel Library instead.
You are not going to get the behavior you want by blocking the UI thread with Thread.Join. That will cause the UI to become unresponsive, and it will effectively serialize the operations and cancel out any advantage you were hoping to gain with threads.
If you really want to limit the number of concurrent operations then a better solution is to create a separate dedicated thread for kicking off the operations. This thread will spin around a loop indefinitely, waiting for items to appear in a queue, and when they do it will dequeue them and use that information to kick off an operation asynchronously (again using the ThreadPool or TPL). The dequeueing thread can contain the logic for limiting the number of concurrent operations. Search for information regarding the producer-consumer pattern to get a better understanding of how you can implement this.
There is a bit of a learning curve, but who said threading was easy right?
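As a concrete illustration of the producer-consumer idea on .NET 4, BlockingCollection gives you the queue and the blocking dequeue for free; the fixed number of consumer threads is what limits concurrency. The IDs, the delay standing in for the page parse, and the cap of 3 are invented for this sketch:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

class Program
{
    static readonly BlockingCollection<int> queue = new BlockingCollection<int>();

    static void Consumer()
    {
        // Blocks while the queue is empty; the loop ends once CompleteAdding
        // has been called and the remaining items have been drained.
        foreach (int id in queue.GetConsumingEnumerable())
        {
            Thread.Sleep(100); // stand-in for the real work, e.g. UpdateLo(id)
            Console.WriteLine("Processed {0}", id);
        }
    }

    static void Main()
    {
        const int maxConcurrent = 3; // cap on simultaneous operations
        var consumers = new Thread[maxConcurrent];
        for (int i = 0; i < maxConcurrent; i++)
        {
            consumers[i] = new Thread(Consumer);
            consumers[i].Start();
        }

        // Producer: the timer/UI side simply enqueues IDs and never blocks long.
        for (int id = 0; id < 10; id++)
            queue.Add(id);

        queue.CompleteAdding();
        foreach (var t in consumers)
            t.Join();
    }
}
```

At most three items are ever processed at once, regardless of how many IDs the producer enqueues.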
If I understand correctly, what you're currently doing is looping through a list of IDs in the UI thread, starting a new thread to handle each one. The blocking issue you're seeing could well be that it's taking too many resources to create unique threads. So, personally (without knowing more), I would redesign the process like so:
//Somewhere in the UI Thread
Thread worker = new Thread(new ParameterizedThreadStart(UpdateLoWorker));
worker.Start(dataGridFollow.Rows);

//Worker thread (ParameterizedThreadStart requires an object parameter)
private void UpdateLoWorker(object state)
{
    var rows = (DataGridViewRowCollection)state;
    foreach (DataGridViewRow r in rows)
    {
        string getID = r.Cells["ID"].Value.ToString();
        int ID = int.Parse(getID);
        UpdateLo(ID);
    }
}
Here you'd have a single non-blocking worker which sequentially handles each ID.
Consider using the Async CTP. It's an async pattern Microsoft recently released for download. It should simplify async programming tremendously. The link is http://msdn.microsoft.com/en-us/vstudio/async.aspx. (Read the whitepaper first.)
Your code would look something like the following. (I've not verified my syntax yet, sorry).
private async Task DoTheWork()
{
    for (int x = 0; x <= dataGridFollow.Rows.Count - 1; x++)
    {
        string getID = dataGridFollow.Rows[x].Cells["ID"].Value.ToString();
        int ID = int.Parse(getID);
        Task t = new Task(new Action<object>(UpdateLo), ID);
        t.Start();
        await t;
    }
}
This method returns a Task that can be checked periodically for completion. This follows the pattern of "fire and forget" meaning you just call it and presumably, you don't care when it completes (as long as it does complete before 15 minutes).
EDIT
I corrected the syntax above; you would need to change UpdateLo to take an object instead of an int.
For a simple background thread runner that will run one thread from a queue at a time you can do something like this:
private List<Thread> mThreads = new List<Thread>();
public static void Main()
{
Thread t = new Thread(ThreadMonitor);
t.IsBackground = true;
t.Start();
}
private static void ThreadMonitor()
{
while (true)
{
foreach (Thread t in mThreads.ToArray())
{
// Runs one thread in the queue and waits for it to finish
t.Start();
mThreads.Remove(t);
t.Join();
}
Thread.Sleep(2000); // Wait before checking for new threads
}
}
// Called from the UI or elsewhere to create any number of new threads to run
public static void DoStuff()
{
Thread t = new Thread(DoCorestuff);
t.IsBackground = true;
mActiveThreads.Add(t);
}
public static void DoStuffCore()
{
// Your code here
}