Issue with thread barrier - PulseAll not reaching all threads - C#

I have a parallel algorithm that I'm having some barrier issues with. Before y'all scream "search": I have looked at the available posts and links, and I have followed the instructions for a barrier built with Monitor.Wait and Monitor.PulseAll, but my issue is that every thread except the last one created (and started) is reached by the PulseAll from my main thread. Here is the basic layout of the code:
public static object signal = new object(); //This one is located as class variable, not in the method
public void RunAlgorithm(List<City> cities){
List<Thread> localThreads = new List<Thread>();
object[] temp = //some parameters here
for(int i = 0; i < numOfCitiesToCheck; i++){
Thread newThread = new Thread((o) => DoWork(o as object[]));
newThread.IsBackground = true;
newThread.Priority = ThreadPriority.AboveNormal;
newThread.Start(temp as object);
localThreads.Add(newThread);
}
//All threads initiated, now we pulse all
lock(signal){
Monitor.PulseAll(signal);
}
int counter = 0;
while(true){
if(counter == localThreads.Count){ break; }
localThreads[counter].Join();
counter++;
}
}
That's what's done by the main thread (I removed a few unnecessary pieces), and as stated before, the main thread always gets stuck in the Join() on the last thread in the list.
This is what the threads' method looks like:
private void DoWork(object[] arguments){
lock(signal){
Monitor.Wait(signal);
}
GoDoWork(arguments);
}
Are there any other barriers I can use for this type of signaling? All I want is for the main thread to signal all threads at the same time so that they start simultaneously; I want them to start at the same time in order to run as close to parallel as possible (I measure the running time of the algorithm and a few other things). Is any part of my barrier or code flawed (the barrier part, I mean)? I tried running an instance with fewer threads and it still gets stuck on the last one; I don't know why. I have confirmed via the VS debugger that the last thread is sleeping (all other threads are !IsAlive, while the last one has IsAlive == true).
Any help appreciated!

I managed to solve it using the Barrier class. Many thanks to Damien_The_Unbeliever! I still can't believe I hadn't heard of it before. (In hindsight, the original approach races: Monitor.PulseAll only wakes threads that are already waiting, so if a worker has not yet reached Monitor.Wait when the pulse fires, the signal is lost and that thread waits forever.)
public Barrier barrier = new Barrier(1);
public void RunAlgorithm(List<City> cities){
List<Thread> localThreads = new List<Thread>();
object[] temp = //some parameters here
for(int i = 0; i < numOfCitiesToCheck; i++){
barrier.AddParticipant();
Thread newThread = new Thread((o) => DoWork(o as object[]));
newThread.IsBackground = true;
newThread.Priority = ThreadPriority.AboveNormal;
newThread.Start(temp as object);
localThreads.Add(newThread);
}
barrier.SignalAndWait();
int counter = 0;
while(true){
if(counter == localThreads.Count){ break; }
localThreads[counter].Join();
counter++;
}
}
private void DoWork(object[] arguments){
barrier.SignalAndWait();
GoDoWork(arguments);
}
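For completeness: the original Monitor-based gate can also be made safe against the lost-wakeup race described above by remembering that the gate has been opened. A minimal sketch of that idea (the released flag is my addition, not part of the original code):
private static readonly object signal = new object();
private static bool released = false; // set once the main thread opens the gate

// Main thread, after starting all workers:
lock (signal)
{
    released = true;
    Monitor.PulseAll(signal);
}

// Each worker, before calling GoDoWork:
lock (signal)
{
    while (!released)
    {
        Monitor.Wait(signal); // safe even if the pulse already happened
    }
}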


Will all the worker threads get generated at once?

Is there anyone out there who can explain the flow of this code to me?
I wonder how the main thread generates worker threads. What I know is:
As soon as the main thread calls the .Start method, it creates a new thread.
But I am confused about how the behavior changes when multiple threads are started in a loop in Main.
static private int counter = 0;
static private object theLock = new Object();

static void Main()
{
    Thread[] tr = new Thread[10];
    for (int i = 0; i < 10; i++)
    {
        tr[i] = new Thread(new ThreadStart(count));
        tr[i].Start();
    }
}

static private void count()
{
    for (int i = 0; i < 10; ++i)
    {
        lock (theLock)
        {
            Console.WriteLine("Count {0} Thread{1}",
                counter++, Thread.CurrentThread.GetHashCode());
        }
    }
}
Is there a good way to debug and track your multithreaded program? After googling, I found the Threads window in debug mode, but I didn't find it useful even after giving custom names to the threads.
I just can't understand the flow: how threads are launched, how they all work together, etc., as breakpoints seem to have no effect in a multithreaded application (at least in my case).
I want this output:
1 printed by Thread : 4551 [ThreadID]
2 printed by Thread : 4552
3 printed by Thread : 4553
4 printed by Thread : 4554
5 printed by Thread : 4555
6 printed by Thread : 4556
7 printed by Thread : 4557
8 printed by Thread : 4558
9 printed by Thread : 4559
10 printed by Thread : 4560
11 printed by Thread : 4551 [same thread ID appears again as in 1]
12 printed by Thread : 4552
I'll try to describe what your code is doing as it interacts with the threading subsystem. The details I'm giving are from what I remember from my OS design university classes, so the actual implementation in the host operating system and/or the CLR internals may vary a bit from what I describe.
static void Main()
{
Thread[] tr = new Thread[10];
for (int i = 0; i < 10; i++)
{
tr[i] = new Thread(new ThreadStart(count));
// The following line puts the thread in a "runnable" thread list that is
// managed by the OS scheduler. The scheduler will allow threads to run by
// considering many factors, such as how many processes are running on
// the system, how much time a runnable thread has been waiting, the process
// priority, the thread's priority, etc. This means you have little control
// over the order of execution. The only certainty is that your thread will
// run at some point in the near future.
tr[i].Start();
}
// At this point you are exiting your Main function, so the program should
// end; however, since you didn't flag your threads as background threads,
// the program will keep running until every thread finishes.
}
static private void count()
{
// The following loop is very short, and it is probable that the thread
// might finish before the scheduler allows another thread to run
// Like user2864740 suggested, increasing the amount of iterations will
// increase the chance that you experience interleaved execution between
// multiple threads
for (int i = 0; i < 10; ++i)
{
// Acquire a mutually-exclusive lock on theLock. Assuming that
// theLock has been declared static, then only a single thread will be
// allowed to execute the code guarded by the lock.
// Any running thread that tries to acquire the lock that is
// being held by a different thread will BLOCK. In this case, the
// blocking operation will do the following:
// 1. Register the thread that is about to be blocked in the
// lock's wait list (this is managed by a specialized class
// known as the Monitor)
// 2. Remove the thread that is about to be blocked from the scheduler's
// runnable list. This way the scheduler won't try to yield
// the CPU to a thread that is waiting for a lock to be
// released. This saves CPU cycles.
// 3. Yield execution (allow other threads to run)
lock (theLock)
{
// Only a single thread can run the following code
Console.WriteLine("Count {0} Thread{1}",
counter++, Thread.CurrentThread.GetHashCode());
}
// At this point the lock is released. The Monitor class will inspect
// the released lock's wait list. If any threads were waiting for the
// lock, one of them will be selected and returned to the scheduler's
// runnable list, where eventually it will be given the chance to run
// and contend for the lock. Again, many factors may be evaluated
// when selecting which blocked thread to return to the runnable
// list, so we can't make any guarantees on the order the threads
// are unblocked
}
}
Hopefully things are a bit clearer now. The important thing here is to acknowledge that you have little control over how individual threads are scheduled for execution, making it impossible (without a fair amount of synchronization code) to replicate the output you are expecting. At most, you can change a thread's priority to hint to the scheduler that a certain thread should be favored over the others. However, this needs to be done very carefully, as it may lead to a nasty problem known as priority inversion. Unless you know exactly what you are doing, it is usually better not to change a thread's priority.
After continued attempts, I managed to complete the requirements of my task. Here is the code:
using System;
using System.Threading;

public class EntryPoint
{
    static private int counter = 0;
    static private object theLock = new Object();
    static object obj = new object();

    static private void count()
    {
        for (int i = 0; i < 10; i++)
        {
            lock (theLock)
            {
                Console.WriteLine("Count {0} Thread{1}",
                    counter++, Thread.CurrentThread.GetHashCode());
                if (counter >= 10)
                    Monitor.Pulse(theLock);
                Monitor.Wait(theLock);
            }
        }
    }

    static void Main()
    {
        Thread[] tr = new Thread[10];
        for (int i = 0; i < 10; i++)
        {
            tr[i] = new Thread(new ThreadStart(count));
            tr[i].Start();
        }
    }
}
Monitor maintains its ready queue in sequential order, hence I achieved what I wanted.
Cheers!

C# Threading program | Regarding thread.join

In the program below, why is the t.Join functionality not working? It continues to print the character O on the screen even though I have told it to wait for the other thread to complete.
class Program
{
bool done;
static void Main(string[] args)
{
Thread t = new Thread(() => Go('U'));
for (int i = 0; i < 1000; i++)
{
Console.Write('O');
Thread.Sleep(500);
}
t.Start();
t.Join();
Console.WriteLine("Thread t has ended!");
Console.Read();
}
static void Go(char p)
{
for (int i = 0; i < 1000; i++)
{
Console.Write(p);
Thread.Sleep(500);
}
}
}
You never started the second thread - you're printing out Os, but you only start the second thread after you're done with that (in about eight minutes).
Move the t.Start(); before the loop, and it should work the way you expect it to work.
Don't guess around with multi-threading - it's incredibly easy to get subtly wrong. Learn what a thread is, what Join does, and how to use multiple threads safely. Otherwise, you'll have a lot of fun debugging issues that are near impossible to reproduce and fix :)
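For reference, a minimal sketch of the corrected ordering (Start before the printing loop), using the same Go method as above:
static void Main(string[] args)
{
    Thread t = new Thread(() => Go('U'));
    t.Start();                    // start the worker before printing O's
    for (int i = 0; i < 1000; i++)
    {
        Console.Write('O');       // O's and U's now interleave
        Thread.Sleep(500);
    }
    t.Join();                     // wait for the worker to finish
    Console.WriteLine("Thread t has ended!");
    Console.Read();
}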

WaitHandle fundamental behavior

Do these two code blocks have the same effect when looking at the console?
Please note: Currently I am still using and bound to .NET 3.5.
First:
for(int i = 0; i<3;i++)
{
Console.WriteLine(i);
}
Second:
class Worker
{
static int i = 0;
static ManualResetEvent manualResetEvent = new ManualResetEvent(false);
static Object locky = new Object();
static void Work(Object workItem)
{
WaitHandle[] wait = new [] { manualResetEvent };
while (WaitHandle.WaitAny(wait) == 0)
{
lock (locky)
{
Console.WriteLine(i++);
}
}
}
}
// main:
Thread thread = new Thread(Worker.Work);
thread.Start();
for (int i=0;i<3;i++)
{
Worker.manualResetEvent.Set();
}
Will the WaitHandle count increase with every signal? Will the loop run until all signals have been processed?
Or will a signal be ignored when the thread is already working?
Can someone please shed some light on this?
Since you're using a ManualResetEvent, once you signal the event, it remains signaled until it's reset. Which means setting it once or three times will have the same effect.
This also means that the worker will go into an infinite loop because the event is never reset.
Also, you can't lock on value types. If you could, the int would be boxed and create a new object every time you lock on it - which means you'd be locking on a different object every single time, rendering the lock useless.
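For comparison, if the intent was one unit of work per signal, an AutoResetEvent behaves differently: it releases a single waiting thread and then resets itself automatically. A minimal sketch of that variant of the Worker, keeping the question's structure (note that Set calls can still coalesce if they arrive while the worker is busy):
class Worker
{
    static int i = 0;
    public static AutoResetEvent autoResetEvent = new AutoResetEvent(false);
    static object locky = new object();

    public static void Work(object workItem)
    {
        while (true)
        {
            autoResetEvent.WaitOne(); // blocks until the next Set(), then auto-resets
            lock (locky)
            {
                Console.WriteLine(i++);
            }
        }
    }
}

// main:
// Thread thread = new Thread(Worker.Work);
// thread.Start();
// for (int i = 0; i < 3; i++)
// {
//     Worker.autoResetEvent.Set(); // each Set releases at most one waiting thread
// }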

Implementing a thread queue/wait, how?

I have a timer calling a function every 15 minutes; this function counts the number of rows in my DGV and starts a thread for each row (running yet another function). Each of those threads parses a web page, which can take anywhere from 1 second to 10 seconds to finish.
While it works fine as it is with 1-6 rows, any more than that will cause the requests to time out.
I want it to wait for the newly created thread to finish processing before getting back into the loop to create another thread, without locking up the main UI:
for (int x = 0; x <= dataGridFollow.Rows.Count - 1; x++)
{
string getID = dataGridFollow.Rows[x].Cells["ID"].Value.ToString();
int ID = int.Parse(getID);
Thread t = new Thread(new ParameterizedThreadStart(UpdateLo));
t.Start(ID);
// <- Wait for thread to finish here before getting back in the for loop
}
I have googled a lot in the past 24 hours, read a lot about this specific issue and its implementations (Thread.Join, ThreadPools, Queuing, and even SmartThreadPool).
It's likely that I've read the correct answer somewhere, but I'm not at ease enough with C# to decipher those threading tools.
Thanks for your time
To avoid freezing the UI, the framework provides a class expressly for these purposes: have a look at the BackgroundWorker class (it executes an operation on a separate thread); here's some info: http://msdn.microsoft.com/en-us/library/system.componentmodel.backgroundworker.aspx
http://msdn.microsoft.com/en-us/magazine/cc300429.aspx
By the way, if I understand correctly, you don't want to parallelize any operation, so just wait for the page-parsing method to complete: basically, for each row of your grid (a foreach loop) you get the ID and call the method. If you want to go parallel, just reuse the same foreach loop and make it parallel (see the sketch after the link below):
http://msdn.microsoft.com/en-us/library/dd460720.aspx
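A minimal sketch of that parallel variant, assuming UpdateLo(ID) is the parsing method from the question and is safe to call from several threads at once (Parallel.ForEach requires .NET 4):
// Collect the IDs on the UI thread first, then parse them in parallel.
var ids = new List<int>();
for (int x = 0; x < dataGridFollow.Rows.Count; x++)
{
    ids.Add(int.Parse(dataGridFollow.Rows[x].Cells["ID"].Value.ToString()));
}

// Runs UpdateLo for several IDs at once; the call blocks until all of them are
// done, so it is better launched from a background thread than from the UI thread.
System.Threading.Tasks.Parallel.ForEach(ids, id => UpdateLo(id));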
What you want is to set off a few workers that do some task.
When one finishes you can start a new one off.
I'm sure there is a better way using thread pools or whatever... but I was bored, so I came up with this.
using System;
using System.Collections.Generic;
using System.Linq;
using System.ComponentModel;
using System.Threading;
namespace WorkerTest
{
class Program
{
static void Main(string[] args)
{
WorkerGroup workerGroup = new WorkerGroup();
Console.WriteLine("Starting...");
for (int i = 0; i < 100; i++)
{
var work = new Action(() =>
{
Thread.Sleep(1000); //somework
});
workerGroup.AddWork(work);
}
while (workerGroup.WorkCount > 0)
{
Console.WriteLine(workerGroup.WorkCount);
Thread.Sleep(1000);
}
Console.WriteLine("Fin");
Console.ReadLine();
}
}
public class WorkerGroup
{
private List<Worker> workers;
private Queue<Action> workToDo;
private object Lock = new object();
public int WorkCount { get { return workToDo.Count; } }
public WorkerGroup()
{
workers = new List<Worker>();
workers.Add(new Worker());
workers.Add(new Worker());
foreach (var w in workers)
{
w.WorkCompleted += (OnWorkCompleted);
}
workToDo = new Queue<Action>();
}
private void OnWorkCompleted(object sender, EventArgs e)
{
FindWork();
}
public void AddWork(Action work)
{
// Guard the queue: AddWork and OnWorkCompleted can run on different threads.
lock (Lock)
{
workToDo.Enqueue(work);
}
FindWork();
}
private void FindWork()
{
lock (Lock)
{
if (workToDo.Count > 0)
{
var availableWorker = workers.FirstOrDefault(x => !x.IsBusy);
if (availableWorker != null)
{
var work = workToDo.Dequeue();
availableWorker.StartWork(work);
}
}
}
}
}
public class Worker
{
private BackgroundWorker worker;
private Action work;
public bool IsBusy { get { return worker.IsBusy; } }
public event EventHandler WorkCompleted;
public Worker()
{
worker = new BackgroundWorker();
worker.DoWork += new DoWorkEventHandler(OnWorkerDoWork);
worker.RunWorkerCompleted += new RunWorkerCompletedEventHandler(OnWorkerRunWorkerCompleted);
}
private void OnWorkerRunWorkerCompleted(object sender, RunWorkerCompletedEventArgs e)
{
if (WorkCompleted != null)
{
WorkCompleted(this, EventArgs.Empty);
}
}
public void StartWork(Action work)
{
if (!IsBusy)
{
this.work = work;
worker.RunWorkerAsync();
}
else
{
throw new InvalidOperationException("Worker is busy");
}
}
private void OnWorkerDoWork(object sender, DoWorkEventArgs e)
{
work.Invoke();
work = null;
}
}
}
This would be just a starting point.
You could start it off with a list of Actions and then have a completed event for when that group of actions is finished.
Then at least you can use a ManualResetEvent to wait for the completed event, or whatever logic you want really; a rough sketch of that follows below.
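(The pending counter and doneSignal below are my names, layered on top of the WorkerGroup above rather than part of it.)
int pending = 100;
var doneSignal = new ManualResetEvent(false);

for (int i = 0; i < 100; i++)
{
    workerGroup.AddWork(() =>
    {
        Thread.Sleep(1000); // some work
        if (Interlocked.Decrement(ref pending) == 0)
        {
            doneSignal.Set(); // the last action to finish releases the waiter
        }
    });
}

doneSignal.WaitOne(); // blocks until every queued action has run
Console.WriteLine("Fin");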
Call a method directly, or do a while loop (with sleep calls) to check the status of the thread.
There are also async events, but they would call another method, and you want to continue from the same point.
I have no idea why the requests would timeout. That sounds like a different issue. However, I can make a few suggestions regarding your current approach.
Avoid creating threads in loops with nondeterministic bounds. There is a lot of overhead in creating threads. If the number of operations is not known before hand then use the ThreadPool or the Task Parallel Library instead.
You are not going to get the behavior you want by blocking the UI thread with Thread.Join. That causes the UI to become unresponsive, and it effectively serializes the operations, cancelling out any advantage you were hoping to gain with threads.
If you really want to limit the number of concurrent operations then a better solution is to create a separate dedicated thread for kicking off the operations. This thread will spin around a loop indefinitely, waiting for items to appear in a queue; when they do, it will dequeue them and use that information to kick off an operation asynchronously (again using the ThreadPool or TPL). The dequeueing thread can contain the logic for limiting the number of concurrent operations. Search for information on the producer-consumer pattern to get a better understanding of how you can implement this; a rough sketch follows below.
There is a bit of a learning curve, but who said threading was easy right?
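A rough sketch of that producer-consumer idea, under the assumption that UpdateLo is the parsing method from the question; the queue, the dispatcher thread, and the cap of two concurrent operations are illustrative choices, not requirements:
static readonly Queue<int> pendingIds = new Queue<int>();
static readonly object queueLock = new object();
static readonly Semaphore concurrencyCap = new Semaphore(2, 2); // at most 2 in flight

// Producer: called from the 15-minute timer for each row's ID.
public static void EnqueueId(int id)
{
    lock (queueLock)
    {
        pendingIds.Enqueue(id);
        Monitor.Pulse(queueLock); // wake the dispatcher
    }
}

// Consumer: one dedicated background thread runs this loop forever.
// Start it once with: new Thread(Dispatcher) { IsBackground = true }.Start();
static void Dispatcher()
{
    while (true)
    {
        int id;
        lock (queueLock)
        {
            while (pendingIds.Count == 0)
            {
                Monitor.Wait(queueLock); // sleep until work arrives
            }
            id = pendingIds.Dequeue();
        }
        concurrencyCap.WaitOne(); // throttle: wait for a free slot
        ThreadPool.QueueUserWorkItem(state =>
        {
            try { UpdateLo(id); }
            finally { concurrencyCap.Release(); }
        });
    }
}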
If I understand correctly, what you're currently doing is looping through a list of IDs in the UI thread, starting a new thread to handle each one. The blocking issue you're seeing could well be that creating that many unique threads takes too many resources. So personally (without knowing more) I would redesign the process like so:
//Somewhere in the UI Thread
Thread worker = new Thread(new ParameterizedThreadStart(UpdateLoWorker));
worker.Start(dataGridFollow.Rows);
//worker thread
private void UpdateLoWorker(object state)
{
    var rows = (DataGridViewRowCollection)state;
    foreach (DataGridViewRow r in rows)
    {
        string getID = r.Cells["ID"].Value.ToString();
        int ID = int.Parse(getID);
        UpdateLo(ID);
    }
}
Here you'd have a single non-blocking worker which sequentially handles each ID.
Consider using the Async CTP. It's an async pattern Microsoft recently released for download. It should simplify async programming tremendously. The link is http://msdn.microsoft.com/en-us/vstudio/async.aspx. (Read the whitepaper first.)
Your code would look something like the following. (I've not verified my syntax yet, sorry).
private async Task DoTheWork()
{
for(int x = 0; x <= dataGridFollow.Rows.Count - 1; x++)
{
string getID = dataGridFollow.Rows[x].Cells["ID"].Value.ToString();
int ID = int.Parse(getID);
Task t = new Task(new Action<object>(UpdateLo), ID);
t.Start();
await t;
}
}
This method returns a Task that can be checked periodically for completion. This follows the "fire and forget" pattern, meaning you just call it and presumably don't care when it completes (as long as it does complete before the next 15-minute tick).
EDIT
I corrected the syntax above; you would need to change UpdateLo to take an object instead of an int.
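A possible call site, just to illustrate checking the returned Task later (IsCompleted is a standard Task property):
Task job = DoTheWork(); // control returns to the caller at the first await
// ... later, e.g. in the next timer tick:
if (job.IsCompleted)
{
    // safe to kick off the next 15-minute run
}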
For a simple background thread runner that will run one thread from a queue at a time you can do something like this:
private static List<Thread> mThreads = new List<Thread>();
public static void Main()
{
Thread t = new Thread(ThreadMonitor);
t.IsBackground = true;
t.Start();
}
private static void ThreadMonitor()
{
while (true)
{
foreach (Thread t in mThreads.ToArray())
{
// Runs one thread in the queue and waits for it to finish
t.Start();
mThreads.Remove(t);
t.Join();
}
Thread.Sleep(2000); // Wait before checking for new threads
}
}
// Called from the UI or elsewhere to create any number of new threads to run
public static void DoStuff()
{
Thread t = new Thread(DoStuffCore);
t.IsBackground = true;
mThreads.Add(t);
}
public static void DoStuffCore()
{
// Your code here
}

Race condition during thread start?

I'm running the following code to start my threads, but they don't start as intended. For some reason, some of the threads start with the same objects (and some don't even start). If I try to debug, they start just fine (extra delay added by me clicking F10 to step through the code).
These are the functions in my forms app:
private void startWorkerThreads()
{
int numThreads = config.getAllItems().Count;
int i = 0;
foreach (ConfigurationItem tmpItem in config.getAllItems())
{
i++;
var t = new Thread(() => WorkerThread(tmpItem, i));
t.Start();
//return t;
}
}
private void WorkerThread(ConfigurationItem cfgItem, int mul)
{
for (int i = 0; i < 100; i++)
{
Thread.Sleep(10*mul);
}
this.Invoke((ThreadStart)delegate()
{
this.textBox1.Text += "Thread " + cfgItem.name + " Complete!\r\n";
this.textBox1.SelectionStart = textBox1.Text.Length;
this.textBox1.ScrollToCaret();
});
}
Anyone able to help me out?
Starting a thread doesn't really start the thread. Instead, it schedules it for execution, i.e. at some point it will get to run when it is scheduled. Scheduling threads is a complex topic and an implementation detail of the OS, so your code should not rely on a particular scheduling order.
You're also capturing variables in your lambda. Please see this post (there is a section on Captured Variables) for the problems associated with doing that.
You've just run into what I'd call the classic lambda capture error.
You provide the ConfigurationItem from the foreach loop directly to the lambda. This leads to all your threads getting the same item (the last one).
To get this to work you have to create a local copy for each item and pass that to each thread:
foreach (ConfigurationItem tmpItem in config.getAllItems())
{
i++;
var currentI = i;
var currentItem = tmpItem;
var t = new Thread(() => WorkerThread(currentItem, currentI));
t.Start();
//return t;
}
And you should also consider using the ThreadPool (a minimal sketch follows the links below).
MSDN Description about how to use the ThreadPool
Short summary of differences here on SO
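For example, a minimal sketch of the ThreadPool route, keeping the per-iteration copies from the fix above:
foreach (ConfigurationItem tmpItem in config.getAllItems())
{
    i++;
    var currentI = i;
    var currentItem = tmpItem;
    ThreadPool.QueueUserWorkItem(state => WorkerThread(currentItem, currentI));
}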
The problem seems to be here: () => WorkerThread(tmpItem, i)
I'm not used to Func<>, but it seems to work like anonymous delegates in .NET 2.0. Thus, you may be holding a reference to the variables used as arguments to the WorkerThread() method, so their values are only retrieved later (when the thread actually runs).
In this case, you may already be at the next iteration of your main thread...
Try this instead:
var t = new Thread(new ParameterizedThreadStart(WorkerThread));
t.Start(new { ConfigurationItem = tmpItem, Index = i });
[EDIT] Other implementation. More flexible if you need to pass new parameters to the thread in the future.
private void startWorkerThreads()
{
int numThreads = config.getAllItems().Count;
int i = 0;
foreach (ConfigurationItem tmpItem in config.getAllItems())
{
i++;
var wt = new WorkerThread(tmpItem, i);
wt.Start();
//return t;
}
}
private class WorkerThread
{
private ConfigurationItem _cfgItem;
private int _mul;
private Thread _thread;
public WorkerThread(ConfigurationItem cfgItem, int mul) {
_cfgItem = cfgItem;
_mul = mul;
}
public void Start()
{
_thread = new Thread(Run);
_thread.Start();
}
private void Run()
{
for (int i = 0; i < 100; i++)
{
Thread.Sleep(10 * _mul);
}
this.Invoke((ThreadStart)delegate()
{
this.textBox1.Text += "Thread " + _cfgItem.name + " Complete!\r\n";
this.textBox1.SelectionStart = textBox1.Text.Length;
this.textBox1.ScrollToCaret();
});
}
}
Do you really need to spawn threads manually (which is a rather expensive operation)? You could try switching to the ThreadPool instead.
You can't assume that the threads will run in the same order they were called, unless you force it, and cause a dependency between them.
So the real question is - what is your goal ?
I think that the error is somewhere else. Here are some hints to help you debug:
Give a name to each thread, and display the thread name instead of the config item name:
this.textBox1.Text += "Thread " + Thread.CurrentThread.Name + " Complete!\r\n";
Display the contents of config.getAllItems(); maybe some items have the same name (duplicates).
===========
Here is some additional information about multithreading with WinForms:
don't create new Threads directly; use the ThreadPool instead:
ThreadPool.QueueUserWorkItem(state => { WorkerThread(tmpItem, i); }); // copy tmpItem and i into locals first, as shown earlier, to avoid the capture problem
If you really want to create your threads yourself, use this.BeginInvoke instead of this.Invoke (a sketch follows this list); your worker thread will finish sooner => fewer concurrent threads => better overall performance
don't call Thread.Sleep in a loop, just do one big sleep: Thread.Sleep(10 * mul * 100);
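A rough sketch of the BeginInvoke variant, assuming it replaces the Invoke call inside WorkerThread from the question; the UI update is posted asynchronously, so the worker thread does not wait for the UI thread to process it:
this.BeginInvoke((ThreadStart)delegate()
{
    this.textBox1.Text += "Thread " + cfgItem.name + " Complete!\r\n";
    this.textBox1.SelectionStart = textBox1.Text.Length;
    this.textBox1.ScrollToCaret();
});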
I hope that this will help you.
Thanks to all of you!
I just implemented the threadpool, and that worked like a charm - with the added bonus of not spawning too many threads at once.
I'll have a look at the other solutions, too, but this time around the threadpool will save me from having to manually check for bozos with too many configs ;)
