I see online that I should use myThread.Join() when I want to block my thread until another thread finishes. (One of the things I don't get is what happens if I have multiple threads.)
But generally, I just don't get when I'd use .Join(), or what kind of situation it's useful for. Can anyone please explain this to me like I'm a fourth grader? A very simple, easy-to-understand explanation will get my answer vote.
Let's say you want to start some worker threads to perform some kind of calculation, and then do something afterwards with all the results.
List<Thread> workerThreads = new List<Thread>();
List<int> results = new List<int>();

for (int i = 0; i < 5; i++) {
    Thread thread = new Thread(() => {
        Thread.Sleep(new Random().Next(1000, 5000));
        lock (results) {
            results.Add(new Random().Next(1, 10));
        }
    });
    workerThreads.Add(thread);
    thread.Start();
}

// Wait for all the threads to finish so that the results list is populated.
// If a thread is already finished when Join is called, Join will return immediately.
foreach (Thread thread in workerThreads) {
    thread.Join();
}

Debug.WriteLine("Sum of results: " + results.Sum());
Oh yeah, and don't use Random like that, I was just trying to write a minimal, easily understandable example. It ends up not really being random if you create new Random instances too close in time, since the seed is based on the clock.
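A minimal sketch of the safer pattern hinted at there: a single shared Random instance guarded by a lock (Random is not thread-safe), instead of newing one up per call. The helper name is made up for illustration:

    // One shared Random instance for all workers; Random is not thread-safe,
    // so guard it with a lock when calling it from multiple threads.
    static readonly Random SharedRandom = new Random();
    static readonly object RandomLock = new object();

    static int NextRandom(int min, int max) {
        lock (RandomLock) {
            return SharedRandom.Next(min, max);
        }
    }

    // Inside the worker thread:
    // Thread.Sleep(NextRandom(1000, 5000));
    // lock (results) { results.Add(NextRandom(1, 10)); }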
In the following code snippet, the main thread calls Join() which causes it to wait for all spawned threads to finish:
static void Main()
{
    Thread regularThread = new Thread(ThreadMethod);
    regularThread.Start();

    Thread regularThread2 = new Thread(ThreadMethod2);
    regularThread2.Start();

    // Wait for spawned threads to end.
    regularThread.Join();
    Console.WriteLine("regularThread returned.");

    regularThread2.Join();
    Console.WriteLine("regularThread2 returned.");
}
Note that if you also spun up a thread from the thread pool (using QueueUserWorkItem for instance), Join would not wait for that background thread. You would need to implement some other mechanism such as using an AutoResetEvent.
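For example, a minimal sketch of that idea for a single queued work item (the event name and the work are made-up placeholders):

    // AutoResetEvent starts unsignalled; the main thread blocks on WaitOne
    // until the pool thread calls Set when it is done.
    AutoResetEvent workDone = new AutoResetEvent(false);

    ThreadPool.QueueUserWorkItem(state =>
    {
        try
        {
            // ... do the actual work here ...
        }
        finally
        {
            workDone.Set(); // signal the main thread, even if the work threw
        }
    });

    workDone.WaitOne(); // blocks until the pool thread signals
    Console.WriteLine("Thread pool work item finished.");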
For an excellent introduction to threading, I recommend reading Joe Albahari's free Threading in C#
This is a very simple program to demonstrate the usage of Thread.Join. Please follow my comments for better understanding, and write the program exactly as it is.
using System;
using System.Threading;

namespace ThreadSample
{
    class Program
    {
        static Thread thread1, thread2;
        static int sum = 0;

        static void Main(string[] args)
        {
            start();
            Console.ReadKey();
        }

        private static void Sample() { sum = sum + 1; }
        private static void Sample2() { sum = sum + 10; }

        private static void start()
        {
            thread1 = new Thread(new ThreadStart(Sample));
            thread2 = new Thread(new ThreadStart(Sample2));
            thread1.Start();
            thread2.Start();
            // thread1.Join();
            // thread2.Join();
            Console.WriteLine(sum);
            Console.WriteLine();
        }
    }
}
1. First, run it as it is (with the Join calls commented out): the result will be 0 (the initial value), 1 (if thread1 finished in time), or 10 (if thread2 finished in time), or even 11 if both happened to finish.
2. Run it with the comment removed from thread1.Join(): the result will always include the 1, because thread1.Join() is called and thread1 must finish before the sum is read (so you will see 1 or 11).
3. Run it with both comments removed: the result will always be 11.
Join is used mainly when you need to wait for a thread (or a bunch of them) to terminate before proceeding with your code.
For this reason it is also particularly useful when you need to collect the results of a thread's execution.
As per Arafangion's comment below, it's also important to join threads if you need to run some cleanup/housekeeping code after having created a thread.
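For example, a minimal sketch of collecting a result after Join, assuming the worker just computes a single value into a captured variable:

    int result = 0; // written by the worker, read by the caller after Join

    Thread worker = new Thread(() =>
    {
        // ... some long computation ...
        result = 42;
    });

    worker.Start();
    // ... do other things in parallel ...
    worker.Join();             // wait until the worker has finished
    Console.WriteLine(result); // safe to read now: Join guarantees the worker's write is visible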
Join will make sure that the thread started on the lines above it has finished executing before the lines below it are executed.
Another example: say your worker thread reads from an input stream, and the read method can block forever. To avoid this, you can apply a timeout using another watchdog thread:
// worker thread
var worker = new Thread(() => {
    Trace.WriteLine("Reading from stream");
    // here is the critical area of the thread, where the real stuff happens
    // Sleep is just an example, simulating any real operation
    Thread.Sleep(10000);
    Trace.WriteLine("Reading finished");
}) { Name = "Worker" };

Trace.WriteLine("Starting worker thread...");
worker.Start();

// watchdog thread
ThreadPool.QueueUserWorkItem((o) => {
    var timeOut = 5000;
    // Join(timeout) returns false if the worker is still running when the timeout expires.
    if (!worker.Join(timeOut))
    {
        Trace.WriteLine("Killing worker thread after " + timeOut + " milliseconds!");
        // Note: Thread.Abort is obsolete and throws PlatformNotSupportedException
        // on .NET Core / .NET 5+; prefer cooperative cancellation there.
        worker.Abort();
    }
});
Adding a delay of 300ms in method "Sample" and a delay of 400ms in "Sample2" from devopsEMK's post would make it easier to understand.
By doing so you can observe that, with the comment removed from the "thread1.Join();" line, the main thread waits for "thread1" to complete and only then moves on.
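A sketch of that modification, using the sleep durations suggested above (everything else stays as in devopsEMK's sample):

    // Sample finishes after roughly 300 ms, Sample2 after roughly 400 ms,
    // so the effect of each Join() becomes much easier to observe.
    private static void Sample()  { Thread.Sleep(300); sum = sum + 1; }
    private static void Sample2() { Thread.Sleep(400); sum = sum + 10; }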
On a console application, I am currently starting an array of threads. Each thread is passed an object and runs a method on it. I would like to know how to call a method on the object inside the individual running threads.
Dispatcher doesn't work. SynchronizationContext.Send runs on the calling thread and Post uses a new thread. I would like to be able to call the method (and pass parameters) on the target thread the object is running on, not on the calling thread and not on a new thread.
Update 2: Sample code
using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

namespace CallingFromAnotherThread
{
    class Program
    {
        static void Main(string[] args)
        {
            var threadCount = 10;
            var threads = new Thread[threadCount];
            Console.WriteLine("Main on Thread " + Thread.CurrentThread.ManagedThreadId);

            for (int i = 0; i < threadCount; i++)
            {
                Dog d = new Dog();
                threads[i] = new Thread(d.Run);
                threads[i].Start();
            }

            Thread.Sleep(5000);

            // how can I call dog.Bark("woof")
            // on the individual dogs and make sure they run on the thread they were created on,
            // not on the calling thread and not on a new thread?
        }
    }

    class Dog
    {
        public void Run()
        {
            Console.WriteLine("Running on Thread " + Thread.CurrentThread.ManagedThreadId);
        }

        public void Bark(string text)
        {
            Console.WriteLine(text);
            Console.WriteLine("Barking on Thread " + Thread.CurrentThread.ManagedThreadId);
        }
    }
}
Update 1:
Using SynchronizationContext.Send results in running on the calling thread:
Channel created
Main thread 10
SyncData Added for thread 11
Consuming channel ran on thread 11
Calling AddConsumer on thread 10
Consumer added consumercb78b. Executed on thread 10
Calling AddConsumer on thread 10
Consumer added consumer783c4. Executed on thread 10
Using SynchronizationContext.Post results in running on a different thread:
Channel created
Main thread 10
SyncData Added for thread 11
Consuming channel ran on thread 11
Calling AddConsumer on thread 12
Consumer added consumercb78b. Executed on thread 6
Calling AddConsumer on thread 10
Consumer added consumer783c4. Executed on thread 7
The target thread must run the code "on itself" - or it is just accessing the object across threads. This is done with some form of event dispatch loop on the target thread itself.
The SynchronizationContext abstraction can and does support this if the underlying provider supports it. For example in either WinForms or WPF (which themselves use the "window message pump") using Post will "run on the UI thread".
Basically, all such constructs follow some variation of the pattern:
// On "target thread"
while (running) {
var action = getNextDelegateFromQueue();
action();
}
// On other thread
postDelegateToQueue(actionToDoOnTargetThread);
It is fairly simple to create a primitive queue system manually - just make sure to use the correct synchronization guards. (Although I am sure there are tidy "solved problem" libraries out there; including wrapping everything up into a SynchronizationContext.)
Here is a primitive version of the manual queue. Note that there is a race condition1, but FWIW:
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading;

namespace DogPark
{
    internal class DogPark
    {
        private readonly string _parkName;
        private readonly Thread _thread;
        private readonly ConcurrentQueue<Action> _actions = new ConcurrentQueue<Action>();
        private volatile bool _isOpen;

        public DogPark(string parkName)
        {
            _parkName = parkName;
            _isOpen = true;
            _thread = new Thread(OpenPark);
            _thread.Name = parkName;
            _thread.Start();
        }

        // Runs in "target" thread
        private void OpenPark(object obj)
        {
            while (true)
            {
                Action action;
                if (_actions.TryDequeue(out action))
                {
                    Program.WriteLine("Something is happening at {0}!", _parkName);
                    try
                    {
                        action();
                    }
                    catch (Exception ex)
                    {
                        Program.WriteLine("Bad dog did {0}!", ex.Message);
                    }
                }
                else
                {
                    // Nothing left!
                    if (!_isOpen && _actions.IsEmpty)
                    {
                        return;
                    }
                }
                Thread.Sleep(0); // Don't toaster CPU
            }
        }

        // Called from external thread
        public void DoItInThePark(Action action)
        {
            if (_isOpen)
            {
                _actions.Enqueue(action);
            }
        }

        // Called from external thread
        public void ClosePark()
        {
            _isOpen = false;
            Program.WriteLine("{0} is closing for the day!", _parkName);
            // Block until queue empty.
            while (!_actions.IsEmpty)
            {
                Program.WriteLine("Waiting for the dogs to finish at {0}, {1} actions left!", _parkName, _actions.Count);
                Thread.Sleep(0); // Don't toaster CPU
            }
            Program.WriteLine("{0} is closed!", _parkName);
        }
    }

    internal class Dog
    {
        private readonly string _name;

        public Dog(string name)
        {
            _name = name;
        }

        public void Run()
        {
            Program.WriteLine("{0} is running at {1}!", _name, Thread.CurrentThread.Name);
        }

        public void Bark()
        {
            Program.WriteLine("{0} is barking at {1}!", _name, Thread.CurrentThread.Name);
        }
    }

    internal class Program
    {
        // "Thread Safe WriteLine"
        public static void WriteLine(string format, params object[] arguments)
        {
            lock (Console.Out)
            {
                Console.Out.WriteLine(format, arguments);
            }
        }

        private static void Main(string[] args)
        {
            Thread.CurrentThread.Name = "Home";

            var yorkshire = new DogPark("Yorkshire");
            var thunderpass = new DogPark("Thunder Pass");

            var bill = new Dog("Bill the Terrier");
            var rosemary = new Dog("Rosie");

            bill.Run();
            yorkshire.DoItInThePark(rosemary.Run);
            yorkshire.DoItInThePark(rosemary.Bark);
            thunderpass.DoItInThePark(bill.Bark);
            yorkshire.DoItInThePark(rosemary.Run);

            thunderpass.ClosePark();
            yorkshire.ClosePark();
        }
    }
}
The output should look about like the following - keep in mind that this will change when run multiple times due to the inherent nature of non-synchronized threads.
Bill the Terrier is running at Home!
Something is happening at Thunder Pass!
Something is happening at Yorkshire!
Rosie is running at Yorkshire!
Bill the Terrier is barking at Thunder Pass!
Something is happening at Yorkshire!
Rosie is barking at Yorkshire!
Something is happening at Yorkshire!
Rosie is running at Yorkshire!
Thunder Pass is closing for the day!
Thunder Pass is closed!
Yorkshire is closing for the day!
Yorkshire is closed!
There is nothing preventing a dog from performing at multiple dog parks simultaneously.
1 There is a race condition present and it is this: a park may close before the last dog action runs.
This is because the dog park thread dequeues the action before the action is run - and the method to close the dog park only waits until all the actions are dequeued.
There are multiple ways to address it, for instance:
The concurrent queue could first peek-use-then-dequeue-after-the-action, or
A separate volatile isClosed-for-real flag (set from the dog park thread) could be used, or ..
I've left the bug in as a reminder of the perils of threading..
A running thread is already executing a method. You cannot directly force that thread to leave the method and enter a new one. However, you could send information to that thread telling it to leave the current method and do something else. But this only works if the executed method can react to that passed information.
In general, you can use threads to call/execute methods, but you cannot call a method ON a running thread.
Edit, based on your updates:
If you want to use the same threads to execute dog.Run and dog.Bark, and do it on the same objects, then you need to modify your code:
static void Main(string[] args)
{
    var threadCount = 10;
    var threads = new Thread[threadCount];
    Console.WriteLine("Main on Thread " + Thread.CurrentThread.ManagedThreadId);

    // keep the dog objects outside the creation block in order to access them later again. Always useful.
    Dog[] dogs = new Dog[threadCount];

    for (int i = 0; i < threadCount; i++)
    {
        dogs[i] = new Dog();
        threads[i] = new Thread(dogs[i].Run);
        threads[i].Start();
    }

    Thread.Sleep(5000);

    // how can I call dog.Bark("woof") --> here you go:
    for (int i = 0; i < threadCount; i++)
    {
        int index = i; // local copy so the lambda doesn't capture the loop variable
        threads[i] = new Thread(() => dogs[index].Bark("woof"));
        threads[i].Start();
    }
    // but this will create NEW threads, because the others have exited after finishing Run and have been deleted. Is this a problem for you?
    // maybe the other threads are still running, causing a parallel execution of Run and Bark.

    // on the individual dogs and make sure they run on the thread they were created on,
    // not on the calling thread and not on a new thread -->
    // instead of Run, call doActions and loop inside that function, checking for commands from external sources:
    for (int i = 0; i < threadCount; i++)
    {
        threads[i] = new Thread(dogs[i].doActions);
        threads[i].Start();
    }
    // but in this case there will be sequential execution of actions. No parallel Run and Bark.
}
Inside your dog class:
enum EnumAction
{
    Nothing,
    Run,
    Bark,
    Exit,
};

EnumAction m_enAction;
object m_oLockAction = new object(); // dummy object used only for locking

void SetAction(EnumAction i_enAction)
{
    Monitor.Enter(m_oLockAction);
    m_enAction = i_enAction;
    Monitor.Exit(m_oLockAction);
}

EnumAction GetAction()
{
    Monitor.Enter(m_oLockAction);
    EnumAction enAction = m_enAction;
    Monitor.Exit(m_oLockAction);
    return enAction;
}

void doActions()
{
    EnumAction enAction;
    do
    {
        Thread.Sleep(20);
        enAction = GetAction();
        switch (enAction)
        {
            case EnumAction.Run:
                Run();
                break;
            // case ... (handle the other actions here)
        }
    } while (enAction != EnumAction.Exit);
}
Got it? ;-)
Sorry for any typos, I was typing on my mobile phone, and I usually use C++CLI.
Another piece of advice: since the variable m_enAction is read inside the thread and written from outside, you need to ensure it gets updated properly when accessed from different threads. The threads MUST NOT cache the variable in a CPU register, otherwise they won't see it changing. Use locks (e.g. Monitor) to achieve that. (But do not use a Monitor on m_enAction itself, because you can only use Monitors on objects; create a dummy object for this purpose, like m_oLockAction above.)
I have added the necessary code. Check out the differences between the edits to see the changes.
You cannot run a second method on a thread while the first method is running. If you want them to run in parallel, you need another thread. However, your object needs to be thread-safe.
Executing a thread simply means executing a sequence of instructions. A dispatcher is nothing more than an infinite loop that executes queued methods one after another.
I recommend using tasks instead of threads. Use Parallel.ForEach to run the Dog.Run method on each dog instance. To run the Bark method, use Task.Run (see the sketch below).
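A minimal sketch of that suggestion, reusing the Dog class from the question (the "woof" argument is just the example text from the question). Note that, unlike the dispatcher approach, these calls run on thread pool threads rather than on one dedicated thread per dog:

    using System;
    using System.Linq;
    using System.Threading.Tasks;

    class Program
    {
        static void Main()
        {
            var dogs = Enumerable.Range(0, 10).Select(_ => new Dog()).ToArray();

            // Run every dog in parallel; ForEach blocks until all Run calls return.
            Parallel.ForEach(dogs, dog => dog.Run());

            // Later, bark on pool tasks and wait for them all.
            var barkTasks = dogs.Select(dog => Task.Run(() => dog.Bark("woof"))).ToArray();
            Task.WaitAll(barkTasks);
        }
    }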
Since you used a running and barking dog as an example, you could write your own "dispatcher": an infinite loop that executes all queued work. In that case you could have all the dogs on a single thread. It sounds weird, but you could have an unlimited number of dogs. In the end, only as many threads can execute at the same time as there are CPU cores available.
I have two threads which use two different functions. The first one searches from start to end and the second one from end to start.
Now I'm using Thread.Sleep(10) for synchronisation, but it takes too much time, and testing is not possible under those conditions.
Any idea how I can sync two threads running different functions?
It depends slightly on what you want to do.
If you have two threads and you just want to exit one when the other reaches "success" (or n threads and you want to exit them all when one reaches "success" first) you just need to periodically check for success on each thread.
Use Interlocked to do this without locks, or some other mechanism (see below)
Use cancellable Task objects
If you need to do your search in phases, where each thread does something and then waits for the other to catch up, you need a different approach.
Use Barrier
Given that you are doing an A*-search, you likely need a combination of two or three of these anyway:
Barrier to coordinate the steps and update the open set between steps
Success signalling to work out when to exit threads if another thread succeeded
Task objects with CancellationToken to allow callers to cancel the search.
Another answer suggested Semaphore - this is not really suitable for your needs (see comments below).
Barrier can be used for searches such as this by:
enter step 0 of the algorithm
n threads split the current level into equal parts and each works on its own part; when a thread completes, it signals and then waits for the other thread(s)
when all threads are ready, proceed to the next step and repeat the search
Simple check for exit - Interlocked
The first part is checking for success. If you want to stay "lockless", you can use Interlocked to do this; the general pattern is:
// global success indicator
private const int NotDone = 0;
private const int AllDone = 1;
private int _allDone = NotDone;

private void GeneralSearchFunction(bool directionForward) {
    bool iFoundIt = false;
    // ... do some search operations that won't take much time
    if (iFoundIt) {
        // set _allDone to AllDone!
        Interlocked.Exchange(ref _allDone, AllDone);
        return;
    }
    // ... do more work
    // after one or a few iterations, if this thread is still going,
    // see if another thread has set _allDone to AllDone
    if (Interlocked.CompareExchange(ref _allDone, NotDone, NotDone) == AllDone) {
        return; // if they did, then exit
    }
    // ... loop to the top and carry on working
}

// main thread:
Thread t1 = new Thread(() => GeneralSearchFunction(true));
Thread t2 = new Thread(() => GeneralSearchFunction(false));
t1.Start(); t2.Start(); // start both
t1.Join(); t2.Join();
// when this gets to here, one of them will have succeeded
This is the general pattern for any kind of success or cancellation token:
do some work
if you succeed, set a signal every other thread checks periodically
if you haven't yet succeeded then in the middle of that work, either every iteration, or every few iterations, check to see if this thread should exit
So an implementation would look like:
class Program
{
    // global success indicator
    private const int NotDone = 0;
    private const int AllDone = 1;
    private static int _allDone = NotDone;

    private static int _forwardsCount = 0;  // counter to simulate a "find"
    private static int _backwardsCount = 0; // counter to simulate a "find"

    static void Main(string[] args) {
        var searchItem = "foo";
        Thread t1 = new Thread(() => DoSearchWithBarrier(SearchForwards, searchItem));
        Thread t2 = new Thread(() => DoSearchWithBarrier(SearchBackwards, searchItem));
        t1.Start(); t2.Start();
        t1.Join(); t2.Join();
        Console.WriteLine("all done");
    }

    private static void DoSearchWithBarrier(Func<string, bool> searchMethod, string searchItem) {
        while (!searchMethod(searchItem)) {
            // after one or a few iterations, if this thread is still going,
            // see if another thread has set _allDone to AllDone
            if (Interlocked.CompareExchange(ref _allDone, NotDone, NotDone) == AllDone) {
                return; // if they did, then exit
            }
        }
        Interlocked.Exchange(ref _allDone, AllDone);
    }

    public static bool SearchForwards(string item) {
        // return true if we "found it", false if not
        return (Interlocked.Increment(ref _forwardsCount) == 10);
    }

    public static bool SearchBackwards(string item) {
        // return true if we "found it", false if not
        return (Interlocked.Increment(ref _backwardsCount) == 20); // make this less than 10 to find it backwards first
    }
}
Using Tasks to the same end
Of course, this wouldn't be .NET 4.5 without using Task:
class Program
{
    private static int _forwardsCount = 0;  // counter to simulate a "find"
    private static int _backwardsCount = 0; // counter to simulate a "find"

    static void Main(string[] args) {
        var searchItem = "foo";
        var tokenSource = new CancellationTokenSource();
        var allDone = tokenSource.Token;
        Task t1 = Task.Factory.StartNew(() => DoSearchWithBarrier(SearchForwards, searchItem, tokenSource, allDone), allDone);
        Task t2 = Task.Factory.StartNew(() => DoSearchWithBarrier(SearchBackwards, searchItem, tokenSource, allDone), allDone);
        Task.WaitAll(new[] { t1, t2 });
        Console.WriteLine("all done");
    }

    private static void DoSearchWithBarrier(Func<string, bool> searchMethod, string searchItem, CancellationTokenSource tokenSource, CancellationToken allDone) {
        while (!searchMethod(searchItem)) {
            if (allDone.IsCancellationRequested) {
                return;
            }
        }
        tokenSource.Cancel();
    }

    ...
}
However, now you have used the CancellationToken for the wrong thing - really it should be reserved for the caller of the search to cancel the search. So use the CancellationToken to check for a requested cancellation (only the caller needs the tokenSource then), and a different success synchronisation (such as the Interlocked sample above) to exit, as sketched below.
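A sketch of that separation, assuming the caller owns the CancellationTokenSource and success is still signalled through the _allDone/NotDone/AllDone flag and the search methods from the samples above:

    private static void DoSearch(Func<string, bool> searchMethod, string searchItem, CancellationToken callerToken) {
        while (!searchMethod(searchItem)) {
            // 1) Has the *caller* asked us to stop?
            if (callerToken.IsCancellationRequested) {
                return;
            }
            // 2) Has the *other search thread* already found the item?
            if (Interlocked.CompareExchange(ref _allDone, NotDone, NotDone) == AllDone) {
                return;
            }
        }
        // We found it: signal success to the other search thread (not via the token).
        Interlocked.Exchange(ref _allDone, AllDone);
    }

    // Caller:
    // var tokenSource = new CancellationTokenSource();   // owned by the caller only
    // Task t1 = Task.Run(() => DoSearch(SearchForwards, "foo", tokenSource.Token));
    // Task t2 = Task.Run(() => DoSearch(SearchBackwards, "foo", tokenSource.Token));
    // tokenSource.Cancel();                              // only the caller ever cancels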
Phase/step synchronisation
This gets harder for many reasons, but there is a simple approach. Using Barrier (new to .NET 4) in conjunction with an exit signal you can:
Perform the assigned thread's work for the current step, and then wait for the other thread to catch up before doing another iteration
Exit both threads when one succeeds
There are many different approaches for thread sync, depending on exactly what you want to achieve. Some are:
Barrier: This is probably the most suitable if you are intending for both your forwards and backwards searches to run at the same time. It also screams out your intent, i.e. "no thread can go on until everyone reaches the barrier"
ManualResetEvent - when one thread releases a signal, all others can proceed until it is set again. AutoResetEvent is similar, except it only allows one thread to proceed before blocking again.
Interlocked - in combination with SpinWait this is a viable lockless solution
Semaphore - possible to use, but not really suited for your scenario
I have only provided a full sample for Barrier here as it seems the most suitable in your case. Barrier is one of the most performant, second only to ManualResetEventSlim (ref. albahari), but using ManualResetEvent will need more complex code.
Other techniques to look at, if none of the above work for you are Monitor.Wait and Monitor.Pulse (now you're using locking) and Task Continuations. The latter is more used for passing data from one async operation to another, but it could be used for your scenario. And, as with the samples at the top of the answer, you are more likely to combine Task with Barrier than use one instead of the other. Task Continuations could be used to do the post-step revision of the open set in the A*-search, but you can just as easily use Barrier for that anyway.
This code, using Barrier works. In essence, DoSearchWithBarrier is the only bit doing the synchronisation - all the rest is setup and teardown code.
class Program {
    ...
    private static int _forwardsCount = 0;  // counter to simulate a "find"
    private static int _backwardsCount = 0; // counter to simulate a "find"

    static void Main(string[] args) {
        Barrier barrier = new Barrier(numThreads,
            b => Console.WriteLine("Completed search iteration {0}", b.CurrentPhaseNumber));
        var searchItem = "foo";

        Thread t1 = new Thread(() => DoSearchWithBarrier(SearchForwards, searchItem, barrier));
        Thread t2 = new Thread(() => DoSearchWithBarrier(SearchBackwards, searchItem, barrier));

        t1.Start(); Console.WriteLine("Started t1");
        t2.Start(); Console.WriteLine("Started t2");
        t1.Join(); Console.WriteLine("t1 done");
        t2.Join(); Console.WriteLine("t2 done");
        Console.WriteLine("all done");
    }

    private static void DoSearchWithBarrier(Func<string, bool> searchMethod, string searchItem, Barrier barrier) {
        while (!searchMethod(searchItem)) {
            // while we haven't found it, wait for the other thread to catch up
            barrier.SignalAndWait();
            // check for the other thread AFTER the barrier
            if (Interlocked.CompareExchange(ref _allDone, NotDone, NotDone) == AllDone) {
                return;
            }
        }
        // set success signal on this thread BEFORE the barrier
        Interlocked.Exchange(ref _allDone, AllDone);
        // wait for the other thread, and then exit (and it will too)
        barrier.SignalAndWait();
    }

    ...
}
There are two things going on here:
Barrier is used to synchronise the two threads so they can't do their next step until the other has caught up
The exit signal uses Interlocked, as I first described.
Implementing this for A* searches is very similar to the above sample. Once all threads reach the barrier and are therefore allowed to continue, you could use a ManualResetEvent or a simple lock to let one (and only one) of them revise the open set, as sketched below.
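One way to do the "one (and only one) revises the open set" part is the Barrier constructor's post-phase action, which the sample above already uses for the Console.WriteLine: it runs exactly once per phase, after all participants have signalled and before any of them are released. A sketch, where openSet and ReviseOpenSet are hypothetical names for your shared open set and the code that revises it:

    // The post-phase action runs on a single thread once per phase,
    // so the open set can be revised here without extra locking.
    Barrier barrier = new Barrier(2, b =>
    {
        Console.WriteLine("Completed search iteration {0}", b.CurrentPhaseNumber);
        ReviseOpenSet(openSet); // hypothetical helper: merge/prune the results of this step
    });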
A note on Semaphore
This is probably not what you want as it's most often used when you have a limited pool of resources, with more resource users requiring access than you have resources.
Think of the PlayStation with CoD on it in the corner of the work canteen - 4 controllers, 20 people waiting (WaitOne) to use it, as soon as your character dies you Release the controller and someone else takes your place. No particular FIFO/LIFO ordering is enforced, and in fact Release can be called by the bouncer you employ to prevent the inevitable fights (i.e. thread identity is not enforced).
Simple check for exit - other approaches
Use of lock for simple success indication
You can achieve the same with locking. Both Interlocked and lock ensure you don't see any memory cache issues with reading a common variable between threads:
private readonly object _syncAllDone = new object();
...
if (iFoundIt) {
    lock (_syncAllDone) { _allDone = AllDone; }
    return;
}
...
// see if another thread has set _allDone to AllDone
lock (_syncAllDone) {
    if (_allDone == AllDone) {
        return; // if they did, then exit
    }
}
The disadvantage of this is that locking may well be slower, but you need to test for your situation. The advantage is that if you are using lock anyway to do other things such as writing out results from your thread, you don't have any extra overhead.
Use of ManualResetEvent for simple success indication
This is not really the intended use of reset events, but it can work. (If using .NET 4 or later, use ManualResetEventSlim instead of ManualResetEvent):
private ManualResetEvent _mreAllDone = new ManualResetEvent(true); // will not block a thread
...
if (iFoundIt) {
    _mreAllDone.Reset(); // stop other threads proceeding
    return;
}
...
// see if another thread has reset _mreAllDone by testing with a 0 timeout
if (!_mreAllDone.WaitOne(0)) {
    return; // if they did, then exit
}
Phase synchronisation - other approaches
All of the other approaches get a lot more complex, as you have to do two-way continuation checks to prevent race conditions and permanently blocked threads. I don't recommend them, so I won't provide a sample here (it would be long and complicated).
References:
Interlocked
ManualResetEvent
MSDN - ManualResetEvent and ManualResetEventSlim
Barrier
MSDN - Continuation Tasks
MSDN - Task Cancellation
Semaphore
thread.Join()
is possibly what you're after. This will make your current thread block until the other thread ends.
It's possible to Join multiple threads, thereby syncing all of them to one point.
List<Thread> threads = new List<Thread>();
threads.Add(new Thread(new ThreadStart(<Actual method here>)));
threads.Add(new Thread(new ThreadStart(<Another method here>)));
threads.Add(new Thread(new ThreadStart(<Another method here>)));

foreach (Thread thread in threads)
{
    thread.Start();
}
// All your threads are now running

foreach (Thread thread in threads)
{
    thread.Join();
}
// You won't get here until all those threads have finished
In some cases you can use an AutoResetEvent to wait for a result from a thread.
You can use Tasks to start/stop/wait for the results of some workers.
You can use the Producer/Consumer pattern with a BlockingCollection in case your functions consume some data and return a collection of something (see the sketch below).
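A minimal producer/consumer sketch with BlockingCollection, assuming two producers (say, the forward and backward searches) emit int results that a consumer sums up; all the names here are made up for illustration:

    using System;
    using System.Collections.Concurrent;
    using System.Threading.Tasks;

    class ProducerConsumerSketch
    {
        static void Main()
        {
            var results = new BlockingCollection<int>();

            // Producers: e.g. the forward and backward searches adding what they find.
            var producers = Task.WhenAll(
                Task.Run(() => { for (int i = 0; i < 5; i++) results.Add(i); }),
                Task.Run(() => { for (int i = 100; i < 105; i++) results.Add(i); }));

            // Once all producers are done, mark the collection complete so the consumer can finish.
            producers.ContinueWith(_ => results.CompleteAdding());

            // Consumer: GetConsumingEnumerable blocks until items arrive and
            // ends once CompleteAdding has been called and the collection is empty.
            int sum = 0;
            foreach (var item in results.GetConsumingEnumerable())
            {
                sum += item;
            }
            Console.WriteLine("Sum of produced results: " + sum);
        }
    }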
Requirement: at any given point in time, only 4 threads should be calling four different functions. As soon as these threads complete, the next available threads should call the same functions.
Current code: this seems to be the worst possible way to achieve something like this. while (true) will cause unnecessary CPU spikes, and I could see CPU usage rising to 70% when running the following code.
Question: how can I use an AutoResetEvent to signal the main thread's Process() function to start the next 4 threads once the first 4 worker threads are done processing, without wasting CPU cycles? Please suggest.
public class Demo
{
    object protect = new object();
    private int counter;

    public void Process()
    {
        int maxthread = 4;
        while (true)
        {
            if (counter <= maxthread)
            {
                counter++;
                Thread t = new Thread(new ThreadStart(DoSomething));
                t.Start();
            }
        }
    }

    private void DoSomething()
    {
        try
        {
            Thread.Sleep(50000); // simulate long running process
        }
        finally
        {
            lock (protect)
            {
                counter--;
            }
        }
    }
}
You can use the TPL to achieve what you want in a simpler way. If you run the code below, you'll notice that an entry is written after each task terminates, and only after all four tasks terminate is the "Finished batch" entry written.
This sample uses the Task.WaitAll to wait for the completion of all tasks. The code uses an infinite loop for illustration purposes only, you should calculate the hasPendingWork condition based on your requirements so that you only start a new batch of tasks if required.
For example:
private static void Main(string[] args)
{
    bool hasPendingWork = true;
    do
    {
        var tasks = InitiateTasks();

        Task.WaitAll(tasks);

        Console.WriteLine("Finished batch...");
    } while (hasPendingWork);
}

private static Task[] InitiateTasks()
{
    var tasks = new Task[4];
    for (int i = 0; i < tasks.Length; i++)
    {
        int wait = 1000 * i;
        tasks[i] = Task.Factory.StartNew(() =>
        {
            Thread.Sleep(wait);
            Console.WriteLine("Finished waiting: {0}", wait);
        });
    }
    return tasks;
}
One other thing: from the textual requirement section of your question I'm led to believe that a batch of four new threads should only start after all previous four threads have completed. However, the code you posted is not compatible with that requirement, since it starts a new thread immediately after a previous thread terminates. You should clarify what exactly your requirement is.
UPDATE:
If you want to start a new task immediately after one of the four threads terminates, you can still use the TPL instead of starting new threads explicitly, but limit the number of concurrently running tasks to four by using a SemaphoreSlim. For example:
private static SemaphoreSlim TaskController = new SemaphoreSlim(4);

private static void Main(string[] args)
{
    var random = new Random(570);
    while (true)
    {
        // Blocks the thread without wasting CPU
        // if the number of resources (4) is exhausted
        TaskController.Wait();

        Task.Factory.StartNew(() =>
        {
            Console.WriteLine("Started");
            Thread.Sleep(random.Next(1000, 3000));
            Console.WriteLine("Completed");

            // Releases a resource, meaning TaskController.Wait will unblock
            TaskController.Release();
        });
    }
}
I have a c# console application that creates up to 5 threads.
The threads are executing fine, but the UI thread shuts down as it finishes its work.
Is there a way to keep the main UI thread running, for as long as the side threads are running?
foreach (var url in urls)
{
    Console.WriteLine("starting thread: " + url);
    ThreadPool.QueueUserWorkItem(new System.Threading.WaitCallback(myMethod), url);
}
I'm kicking off my threads as per the code above.
The threads in the ThreadPool are background threads and that means an exiting application won't wait for them to complete.
You have a few options:
wait with, for example, a semaphore (see the sketch after this list)
wait on a counter with Sleep() , very crude but OK for a simple console app.
use the TPL, Parallel.ForEach(urls, url => MyMethod(url));
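A minimal sketch of the semaphore option: each work item releases the semaphore once, and the main thread waits for exactly one release per queued item before exiting. myMethod and urls are the names from the question (urls is assumed to be a List<string>); the rest is made up for illustration:

    // Start signalled count at 0; every completed work item releases once.
    var done = new SemaphoreSlim(0);

    foreach (var url in urls)
    {
        Console.WriteLine("starting thread: " + url);
        ThreadPool.QueueUserWorkItem(state =>
        {
            try { myMethod(state); }
            finally { done.Release(); } // count this item as finished, even on error
        }, url);
    }

    // Block the main thread until every queued item has released once.
    for (int i = 0; i < urls.Count; i++)
    {
        done.Wait();
    }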
If you are using .NET 4.0:
var tasks = new List<Task>();
foreach (var url in urls)
{
    tasks.Add(Task.Factory.StartNew(myMethod, url));
}

// do other stuff...

// On shutdown, give yourself X number of seconds to wait for them to complete...
Task.WaitAll(tasks.ToArray(), TimeSpan.FromSeconds(30));
Ah - the ThreadPool uses background threads. Your work is queued, but then your program ends. Finished. The program terminates.
Read up on semaphores (wait/signal) and waiting: the threads signal at the end of the callback that they have ended, and once all have signaled, the main thread can continue.
If you're using .NET 4 then:
urls.AsParallel().ForAll(MyMethod);
Prior to .NET 4, start individual threads, keep them in a list and call Join(). The fact that the workers are not background threads would keep them alive after the main thread exits, but the Join() is more explicit.
List<Thread> workers = new List<Thread>();

foreach (var url in urls)
{
    Thread t = new Thread(MyMethod) { IsBackground = false };
    workers.Add(t);
    t.Start(url);
}

foreach (var worker in workers)
{
    worker.Join();
}
Simplest hack to fix your problem.
In your program class:
static volatile int ThreadsComplete = 0;
In your "myMethod" at the end before return:
//ThreadsComplete++; //*edit* for safety's sake
Interlocked.Increment(ref ThreadsComplete);
In your main method before it returns/ends:
while(ThreadsComplete < urls.Count) { Thread.Sleep(10); }
The above essentially hacks together a WaitForAll synchronization method.
as fields (so that myMethod can see them):
static readonly ManualResetEvent m = new ManualResetEvent(false);
static int pending;

in Main:
pending = urls.Count;
// do something
foreach (var url in urls)
{
    Console.WriteLine("starting thread: " + url);
    ThreadPool.QueueUserWorkItem(new System.Threading.WaitCallback(myMethod), url);
}
m.WaitOne(); // blocks until the last work item calls Set

private static void myMethod(object obj)
{
    try
    {
        // do smt
    }
    finally
    {
        // only the last work item to finish signals the main thread
        if (Interlocked.Decrement(ref pending) == 0)
        {
            m.Set();
        }
    }
}
I am creating a thread A and in that thread creating a new thread B.
So what is the thread hierarchy? Is thread B a child of thread A, or are the threads created as peers?
I want to abort the parent thread A which in turn kills/aborts its child threads.
How is that possible in C#?
Threads should ideally never be aborted. It simply isn't safe. Consider this as a way of putting down an already sick process. Otherwise, avoid like the plague.
The more correct way of doing this is to have something that the code can periodically check, and itself decide to exit.
An example of stopping threads the polite way:
using System;
using System.Threading;

namespace Treading
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Main program starts");

            Thread firstThread = new Thread(A);
            ThreadStateMessage messageToA = new ThreadStateMessage() { YouShouldStopNow = false };
            firstThread.Start(messageToA);

            Thread.Sleep(50); // Let other threads do their thing for 0.05 seconds

            Console.WriteLine("Sending stop signal from main program!");
            messageToA.YouShouldStopNow = true;
            firstThread.Join();

            Console.WriteLine("Main program ends - press any key to exit");
            Console.Read();
        }

        private class ThreadStateMessage
        {
            // This assignment is not really needed, since the default value is false.
            // Consider marking the field volatile so the reading thread always sees the update.
            public bool YouShouldStopNow = false;
        }

        public static void A(object param)
        {
            ThreadStateMessage myMessage = (ThreadStateMessage)param;
            Console.WriteLine("Hello from A");

            ThreadStateMessage messageToB = new ThreadStateMessage();
            Thread secondThread = new Thread(B);
            secondThread.Start(messageToB);

            while (!myMessage.YouShouldStopNow)
            {
                Thread.Sleep(10);
                Console.WriteLine("A is still running");
            }

            Console.WriteLine("Sending stop signal from A!");
            messageToB.YouShouldStopNow = true;
            secondThread.Join();

            Console.WriteLine("Goodbye from A");
        }

        public static void B(object param)
        {
            ThreadStateMessage myMessage = (ThreadStateMessage)param;
            Console.WriteLine("Hello from B");

            while (!myMessage.YouShouldStopNow)
            {
                Thread.Sleep(10);
                Console.WriteLine("B is still running");
            }

            Console.WriteLine("Goodbye from B");
        }
    }
}
Using Thread.Abort(); causes an exception to be thrown if your thread is in a waiting state of any kind. This is sort of annoying to handle, since there are quite a number of ways that a thread can be waiting. As others have said, you should generally avoid doing it.
Thread.Abort will do what you want, but it is not recommended to abort a thread. A better choice is to find a way to finish the threads cleanly using a thread synchronization mechanism, such as a shared stop flag or a CancellationToken (see the sketch below).
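A minimal sketch of that idea using CancellationTokenSource (my example, not the original poster's code): the "parent" thread A passes the same token to its "child" thread B, so one Cancel call stops both cooperatively.

    using System;
    using System.Threading;

    class CooperativeStop
    {
        static void Main()
        {
            var cts = new CancellationTokenSource();

            Thread a = new Thread(() => WorkA(cts.Token)) { Name = "A" };
            a.Start();

            Thread.Sleep(100);
            cts.Cancel();   // politely ask A (and the B it started) to stop
            a.Join();       // A joins B before returning, so both are done here
        }

        static void WorkA(CancellationToken token)
        {
            Thread b = new Thread(() => WorkB(token)) { Name = "B" };
            b.Start();

            while (!token.IsCancellationRequested)
            {
                Thread.Sleep(10); // A's real work goes here
            }
            b.Join(); // wait for the child to notice the cancellation too
        }

        static void WorkB(CancellationToken token)
        {
            while (!token.IsCancellationRequested)
            {
                Thread.Sleep(10); // B's real work goes here
            }
        }
    }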
Here's yet another way to politely signal a thread to die:
Note that this fashion favors finite state automatons where the slave periodically checks for permission to live, then performs a task if allowed. Tasks are not interrupted and are 'atomic'. This works great with simple loops or with command queues. It also makes sure the thread doesn't spin at 100% CPU, by giving the slave thread a rest period; set the rest period to 0 if you don't want any rest in your slave.
var dieEvent = new AutoResetEvent(false);
int slaveRestPeriod = 20; // let's not hog the CPU with an endless loop

var master = new Thread(() =>
{
    doStuffAMasterDoes(); // long running operation
    dieEvent.Set();       // kill the slave
});

var slave = new Thread(() =>
{
    // WaitOne doubles as the rest period: it returns true once dieEvent is set
    while (!dieEvent.WaitOne(slaveRestPeriod))
    {
        doStuffASlaveDoes();
    }
});

slave.Start();
master.Start();
Threads are created as peers. Obtain a handle to thread A and then call ThreadA.Abort() to forcefully end it. It's better, though, to check a boolean flag in the thread and, if it evaluates to false, exit the thread cleanly.
public class MyClass
{
    public static Thread ThreadA;
    public static Thread ThreadB;

    private void RunThings()
    {
        ThreadA = new Thread(new ThreadStart(ThreadAWork));
        ThreadB = new Thread(new ThreadStart(ThreadBWork));
        ThreadA.Start();
        ThreadB.Start();
    }

    static void ThreadAWork()
    {
        // do some stuff
        // thread A will close now, all work is done.
    }

    static void ThreadBWork()
    {
        // do some stuff
        ThreadA.Abort(); // close thread A
        // thread B will close now, all work is done.
    }
}