I'm doing some C# threading. No problems starting the threads and transferring data to them, but I have a problem with waiting for them to end.
My code is shown below. I'm using Join() to wait for the threads to end, but for some reason my code doesn't work.
The main thread (i.e. the for loop) isn't blocked despite calling Join() on all the active threads.
Any idea what I'm doing wrong?
List<Thread> calculationThreads = new List<Thread>();
foreach (string calculation in calculations)
{
if (calculationThreads.Count < 5)
{
Thread calculationThread = new Thread(DoCalculation);
calculationThreads.Add(calculationThread);
calculationThread.Start(threadData);
}
else
{
// Wait for the threads to complete
foreach (Thread calculationThread in calculationThreads)
{
calculationThread.Join();
}
}
}
The first problem is your handling of the else case. If there are already five threads, the code waits for them to finish, but the task it was trying to add is never added; it just throws that task away and goes on to the next one.
The second problem is that you don't remove any threads from the list, so once it reaches five threads, it will wait forever. If the first problem didn't discard the rest of the tasks, your program would just lock up.
Also, you are wasting processing time by waiting for all five threads to finish before continuing the work, but that's a smaller problem.
I would go for an approach where I keep a count of how many threads I've started, and decrement the counter at the end of each thread.
Then, at the beginning of your loop, you can have
while(counter >= 5)
{
//Wait
}
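As a sketch of that counter idea, but blocking on a semaphore instead of spinning on the counter (the `ThrottledRunner`/`RunAll` names and the five-slot limit are illustrative, not from the original code):

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

static class ThrottledRunner
{
    // Runs one worker thread per item, at most maxParallel at a time,
    // and returns only after every item has been processed.
    public static int RunAll(IEnumerable<string> calculations,
                             Action<string> doCalculation,
                             int maxParallel)
    {
        var slots = new Semaphore(maxParallel, maxParallel);
        var threads = new List<Thread>();
        int processed = 0;

        foreach (string calculation in calculations)
        {
            slots.WaitOne();                 // blocks while maxParallel threads are busy
            var t = new Thread(arg =>
            {
                try
                {
                    doCalculation((string)arg);
                    Interlocked.Increment(ref processed);
                }
                finally
                {
                    slots.Release();         // free a slot when this thread is done
                }
            });
            threads.Add(t);
            t.Start(calculation);
        }

        foreach (var t in threads)
            t.Join();                        // wait for the stragglers
        return processed;
    }
}
```

Unlike the original code, no item is ever skipped: the loop blocks on `WaitOne` until a slot frees up, then starts a thread for the current item.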
You can have a while loop that waits for all the threads to end.
List<Thread> calculationThreads = new List<Thread>();
foreach (string calculation in calculations)
{
if (calculationThreads.Count < 5)
{
Thread calculationThread = new Thread(DoCalculation);
calculationThreads.Add(calculationThread);
calculationThread.Start(threadData);
}
else
{
// Wait for the threads to complete
while (calculationThreads.Any(x => x.IsAlive)) { }
// Clear the list
calculationThreads.Clear();
}
}
If you want to keep the threads around after the foreach loop, you should store them in a second list.
How many calculations are you providing to the method?
Reading the code, if you provide 4 calculations you'll start 4 threads but never actually go to the code where you do a thread.Join().
Move the Join loop outside the if/else statement.
List<Thread> calculationThreads = new List<Thread>();
foreach (string calculation in calculations)
{
if (calculationThreads.Count < 5)
{
Thread calculationThread = new Thread(DoCalculation);
calculationThreads.Add(calculationThread);
calculationThread.Start(threadData);
}
}
foreach (Thread calculationThread in calculationThreads)
{
calculationThread.Join();
}
Is anyone out there who can explain me the flow of this code?
I wonder how the main thread generates worker threads. What I know is:
As soon as the main thread calls the Start method, a new thread is created.
But I'm confused about how the behavior changes when multiple threads are started in a loop in Main.
static void Main()
{
Thread[] tr = new Thread[10];
for (int i = 0; i < 10; i++)
{
tr[i] = new Thread(new ThreadStart(count));
tr[i].Start();
}
}
static private void count()
{
for (int i = 0; i < 10; ++i)
{
lock (theLock)
{
Console.WriteLine("Count {0} Thread{1}",
counter++, Thread.CurrentThread.GetHashCode());
}
}
}
Is there a good way to debug and track your multithreaded program? After googling, I found the Threads window in debug mode, but I couldn't find it useful even after giving the threads custom names.
I just can't understand the flow: how the threads are launched, how they all work together, etc., as breakpoints seem to have no effect in a multithreaded application (at least in my case).
I want this output:
1 printed by Thread : 4551 [ThreadID]
2 printed by Thread : 4552
3 printed by Thread : 4553
4 printed by Thread : 4554
5 printed by Thread : 4555
6 printed by Thread : 4556
7 printed by Thread : 4557
8 printed by Thread : 4558
9 printed by Thread : 4559
10 printed by Thread : 4560
11 printed by Thread : 4551 [same Thread Id appears again as in 1]
12 printed by Thread : 4552
I'll try to describe what your code is doing as it interacts with the threading subsystem. The details I'm giving are from what I remember from my OS design university classes, so the actual implementation in the host operating system and/or the CLR internals may vary a bit from what I describe.
static void Main()
{
Thread[] tr = new Thread[10];
for (int i = 0; i < 10; i++)
{
tr[i] = new Thread(new ThreadStart(count));
// The following line puts the thread in a "runnable" thread list that is
// managed by the OS scheduler. The scheduler will allow threads to run by
// considering many factors, such as how many processes are running on
// the system, how much time a runnable thread has been waiting, the process
// priority, the thread's priority, etc. This means you have little control
// on the order of execution, The only certain fact is that your thread will
// run, at some point in the near future.
tr[i].Start();
}
// At this point you are exiting your main function, so the program should
// end; however, since you didn't flag your threads as BackgroundThreads,
// the program will keep running until every thread finishes.
}
static private void count()
{
// The following loop is very short, and it is probable that the thread
// might finish before the scheduler allows another thread to run
// Like user2864740 suggested, increasing the amount of iterations will
// increase the chance that you experience interleaved execution between
// multiple threads
for (int i = 0; i < 10; ++i)
{
// Acquire a mutually-exclusive lock on theLock. Assuming that
// theLock has been declared static, then only a single thread will be
// allowed to execute the code guarded by the lock.
// Any running thread that tries to acquire the lock that is
// being held by a different thread will BLOCK. In this case, the
// blocking operation will do the following:
// 1. Register the thread that is about to be blocked in the
// lock's wait list (this is managed by a specialized class
// known as the Monitor)
// 2. Remove the thread that is about to be blocked from the scheduler's
// runnable list. This way the scheduler won't try to yield
// the CPU to a thread that is waiting for a lock to be
// released. This saves CPU cycles.
// 3. Yield execution (allow other threads to run)
lock (theLock)
{
// Only a single thread can run the following code
Console.WriteLine("Count {0} Thread{1}",
counter++, Thread.CurrentThread.GetHashCode());
}
// At this point the lock is released. The Monitor class will inspect
// the released lock's wait list. If any threads were waiting for the
// lock, one of them will be selected and returned to the scheduler's
// runnable list, where eventually it will be given the chance to run
// and contend for the lock. Again, many factors may be evaluated
// when selecting which blocked thread to return to the runnable
// list, so we can't make any guarantees on the order the threads
// are unblocked
}
}
Hopefully things are clearer. The important thing here is to acknowledge that you have little control of how individual threads are scheduled for execution, making it impossible (without a fair amount of synchronization code) to replicate the output you are expecting. At most, you can change a thread's priority to hint the scheduler that a certain thread must be favored over other threads. However, this needs to be done very carefully, as it may lead to a nasty problem known as priority inversion. Unless you know exactly what you are doing, it is usually better not to change a thread's priority.
After continued effort, I managed to complete the requirements of my task. Here is the code:
using System;
using System.Threading;
public class EntryPoint
{
static private int counter = 0;
static private object theLock = new Object();
static object obj = new object();
static private void count()
{
    for (int i = 0; i < 10; i++)
    {
        lock (theLock)
        {
            Console.WriteLine("Count {0} Thread{1}",
                counter++, Thread.CurrentThread.GetHashCode());
            if (counter >= 10)
                Monitor.Pulse(theLock);
            Monitor.Wait(theLock);
        }
    }
}
static void Main()
{
Thread[] tr = new Thread[10];
for (int i = 0; i < 10; i++)
{
tr[i] = new Thread(new ThreadStart(count));
tr[i].Start();
}
}
}
Monitor keeps the waiting threads in a ready queue and (in practice) wakes them roughly in order, hence I got the sequential output I wanted:
Cheers!
I've used the code below to implement and test a blocking queue. I test the queue by starting 5 concurrent threads (the removers) to pull items off the queue, blocking if the queue is empty, and 1 concurrent thread (the adder) to add items to the queue intermittently. However, if I leave it running for long enough, I get an exception because one of the remover threads comes out of its waiting state even though the queue is empty.
Does anyone know why I get the exception? Note, I'm interested in knowing why this doesn't work as opposed to a working solution (as I can just Google that).
I'd greatly appreciate your help.
using System;
using System.Threading;
using System.Collections.Generic;
namespace Code
{
class Queue<T>
{
private List<T> q = new List<T>();
public void Add(T item)
{
lock (q)
{
q.Add(item);
if (q.Count == 1)
{
Monitor.Pulse(q);
}
}
}
public T Remove()
{
lock (q)
{
if (q.Count == 0)
{
Monitor.Wait(q);
}
T item = q[q.Count - 1];
q.RemoveAt(q.Count - 1);
return item;
}
}
}
class Program
{
static Random r = new Random();
static Queue<int> q = new Queue<int>();
static int count = 1;
static void Adder()
{
while (true)
{
Thread.Sleep(1000 * ((r.Next() % 5) + 1));
Console.WriteLine("Will try to add");
q.Add(count++);
}
}
static void Remover()
{
while (true)
{
Thread.Sleep(1000 * ((r.Next() % 5) + 1));
Console.WriteLine("Will try to remove");
int item = q.Remove();
Console.WriteLine("Removed " + item);
}
}
static void Main(string[] args)
{
Console.WriteLine("Test");
for (int i = 0; i < 5; i++)
{
Thread remover = new Thread(Remover);
remover.Start();
}
Thread adder = new Thread(Adder);
adder.Start();
}
}
}
if I leave it running for long enough I get an exception because one of the remover threads comes out of a waiting state even when the queue is empty. Does anyone know why I get the exception?
The question is odd, because obviously you know the answer: your first sentence answers the question asked by the second sentence. You get the exception because a remover thread comes out of the wait state when the queue is empty.
To solve the problem you'll want to use a loop instead of an "if". The correct code is:
while(q.Count == 0) Monitor.Wait(q);
not
if(q.Count == 0) Monitor.Wait(q);
UPDATE:
A commenter points out that perhaps your question was intended to be "under what circumstances can a consumer thread obtain the monitor when the queue is empty?"
Well, you are in a better position to answer that than we are, since you're the one running the program and looking at the output. But just off the top of my head, here's a way that could happen:
Consumer Thread 1: waiting
Consumer Thread 2: ready
Producer Thread 3: owns the monitor
There is one element in the queue.
Thread 3 pulses.
Thread 1 goes to ready state.
Thread 3 abandons the monitor.
Thread 2 enters the monitor.
Thread 2 consumes the item in the queue
Thread 2 abandons the monitor.
Thread 1 enters the monitor.
And now thread 1 is in the monitor with an empty queue.
Generally speaking when reasoning about these sorts of problems you should think of "Pulse" as being like a pigeon with a note attached to it. Once released it has no connection to the sender, and if it cannot find its home, it dies in the wilderness with its message undelivered. All you know when you Pulse is that if there is any thread waiting then one thread will move to the ready state at some time in the future; you don't know anything else about the relative timing of operations on threads.
Your code would work if there were only one consumer, but with more than one this mechanism fails; it should be while(q.Count == 0) Monitor.Wait(q)
The following scenario shows when if(q.Count == 0) Monitor.Wait(q) would fail (it's different from Eric's):
consumer 1 is waiting
producer has put in an item and is pulsing
consumer 1 is ready
producer is releasing lock
consumer 2 just entered Remove, is lucky and acquires lock
consumer 2 sees 1 item, does not wait and takes item out
consumer 2 releases lock
consumer 1 re-acquires lock but queue is empty
This happens exactly as documentation says it can happen:
When the thread that invoked Pulse releases the lock, the next thread in the ready queue (which is not necessarily the thread that was pulsed) acquires the lock.
Eric is of course right; while the code appears to cover all the bases, the fact that an exception occurs shows that it doesn't.
The race condition is between the Monitor.Wait on a remover and the Monitor.Pulse on the adder (which releases the lock, but doesn't necessarily immediately wake the waiting thread so it can reacquire it); a subsequent remover thread can acquire the lock and jump straight past the
if (q.Count == 0)
{
Monitor.Wait(q);
}
statement and go straight to removing the item. Then the pulsed thread wakes up and assumes there's still an item there; but there isn't.
The way to fix it, whatever the way the race condition is actually manifesting, is as Eric has said.
Equally, if you read the example on Monitor.Pulse you'll see a setup similar to what you have done here, but done in a subtly different way.
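Folding the while fix back into a collection like the one in the question gives roughly the sketch below. It also pulses on every Add; pulsing only when the count reaches 1, as the original does, can still miss wakeups when several consumers are waiting. The `BlockingStack` name is illustrative (the original removes the last item, so it is really a stack):

```csharp
using System.Collections.Generic;
using System.Threading;

class BlockingStack<T>
{
    private readonly List<T> q = new List<T>();

    public void Add(T item)
    {
        lock (q)
        {
            q.Add(item);
            Monitor.Pulse(q);   // wake one waiter on every add, not just on 0 -> 1
        }
    }

    public T Remove()
    {
        lock (q)
        {
            // Re-check on every wakeup: another consumer may have taken the
            // item between the Pulse and this thread re-acquiring the lock.
            while (q.Count == 0)
                Monitor.Wait(q);

            T item = q[q.Count - 1];
            q.RemoveAt(q.Count - 1);
            return item;
        }
    }
}
```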
When a parent thread sleeps, do its sub-threads also sleep?
The main thread is the UI thread.
I create 20 sub-threads inside the main thread with a task factory (let's call them threads 2).
Inside each of these 20 sub-threads I create another 10 sub-threads, again with the task factory (let's call them threads 3).
Now, inside threads 2 I have an infinite loop that checks whether the threads 3 have completed. If one has completed, I dispose of it and start another. I use a 250 ms sleep on each pass of the checking loop. So when a thread 2 is sleeping, do its threads 3 also sleep, or are they independent? Here is the code:
while (true)
{
int irActiveThreadCount = 0;
int irFinishedLast = -1;
for (int i = 0; i < irPerMainSiteThreadCount; i++)
{
if (MainSitesTaskList[irWhichMainTask, i] == null)
{
irFinishedLast = i;
break;
}
if (MainSitesTaskList[irWhichMainTask, i].IsCompleted == true)
{
irFinishedLast = i;
break;
}
}
for (int i = 0; i < irPerMainSiteThreadCount; i++)
{
if (MainSitesTaskList[irWhichMainTask, i] != null)
if (MainSitesTaskList[irWhichMainTask, i].IsCompleted == false)
{
irActiveThreadCount++;
}
}
if (irFinishedLast > -1)
{
var newTask = Task.Factory.StartNew(() =>
{
fcStartSubPageCrawl(srMainSiteURL, srMainSiteId, irWhichMainTask);
});
lock (lockerMainSitesArray)
{
if (MainSitesTaskList[irWhichMainTask, irFinishedLast] != null)
MainSitesTaskList[irWhichMainTask, irFinishedLast].Dispose();
MainSitesTaskList[irWhichMainTask, irFinishedLast] = newTask;
}
}
Thread.Sleep(250);
srQuery = "myquery";
using (DataSet dsTemp = DbConnection.db_Select_Query(srQuery))
{
if (dsTemp != null)
if (dsTemp.Tables.Count > 0)
if (dsTemp.Tables[0].Rows.Count == 0)
{
break;
}
}
}
There's no such thing as a "parent" thread really. One thread starts another, but then there's no particular relationship between them. For example, the starting thread can terminate without any of the new threads dying.
The starting thread sleeping definitely doesn't affect any other thread.
There is no concept of parent and child threads. One implication of this is that the child threads don't sleep when the parent thread sleeps.
Thread.Sleep(...)
only suspends the current Thread.
check here: Thread.Sleep Method
so all other threads will keep working.
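A small sketch that demonstrates this (the method name and counts are purely illustrative): a worker thread keeps ticking while the calling thread sleeps.

```csharp
using System.Threading;

static class SleepDemo
{
    // Returns how many times a worker thread ticked while the calling
    // thread was asleep; a positive result shows the worker kept running.
    public static int TicksWhileCallerSleeps()
    {
        int ticks = 0;
        var worker = new Thread(() =>
        {
            for (int i = 0; i < 20; i++)
            {
                Interlocked.Increment(ref ticks);
                Thread.Sleep(10);   // suspends only the worker itself
            }
        });
        worker.Start();

        Thread.Sleep(100);          // suspends only the calling thread
        int seen = Thread.VolatileRead(ref ticks);
        worker.Join();
        return seen;
    }
}
```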
Each thread is always totally independent. The only connection between threads is that when all the non-background threads finish, the program ends, so the background threads die. If a thread sleeps, the other threads continue working (and probably go faster, because there is one less thread competing for the CPU). If you need to synchronize threads, there are various mechanisms for it: in general locks (the lock statement, not a class), mutexes, semaphores...
The others are right, there is no concept of “parent threads” in .Net. And waiting on one thread doesn't cause other threads to wait (unless there is some synchronization involved, like using locks).
But there's another point: your code doesn't necessarily create new threads. When you call Task.Factory.StartNew(), the task is usually scheduled on a thread-pool thread. If there isn't any thread available and the number of threads hasn't reached the maximum allowed yet, a new thread is created. But otherwise, the task either reuses an existing idle thread, or waits until one becomes available.
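A rough way to observe this reuse (the class and method names are made up for the example): start a batch of short tasks, record which managed thread each one ran on, and count the distinct ids. Typically far fewer distinct threads than tasks show up.

```csharp
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

static class PoolDemo
{
    // Starts taskCount short tasks and counts how many distinct
    // managed threads actually executed them.
    public static int DistinctThreadCount(int taskCount)
    {
        var ids = new HashSet<int>();
        var gate = new object();
        var tasks = new Task[taskCount];

        for (int i = 0; i < taskCount; i++)
        {
            tasks[i] = Task.Factory.StartNew(() =>
            {
                lock (gate)
                {
                    ids.Add(Thread.CurrentThread.ManagedThreadId);
                }
            });
        }
        Task.WaitAll(tasks);
        return ids.Count;   // typically much smaller than taskCount
    }
}
```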
I have a list of threads, and I'm trying to get my main thread to wait for all of the threads in the list to terminate:
while (consumers.Count > 0) consumers[0].Join();
The problem is, this isn't atomic, and I can get an "Index was out of range" exception.
Is there any atomic way to check for the existence of consumers[0] and call consumers[0].Join()?
Note: I can't do
lock (myLocker) { if (consumers.Count > 0) consumers[0].Join(); }
because I don't want to block the other threads from accessing consumers while stuck in Join().
Well if you don't have any synchronization applied at all but your list is being modified from multiple threads, you're in trouble already. Does everything else use myLocker? Assuming it does, how about:
while(true)
{
List<Thread> copy;
lock (myLocker)
{
copy = new List<Thread>(consumers);
}
if (copy.Count == 0)
{
break;
}
foreach (Thread thread in copy)
{
thread.Join();
}
}
Note that this only accesses consumers while holding the lock, which is what you should do everywhere to achieve thread safety. It also calls Join on all the threads after taking a copy, rather than just doing one for each iteration.
If you know that threads won't be added to the list at this point (e.g. it's a thread pool which is draining) you can remove the loop and the check:
List<Thread> copy;
lock (myLocker)
{
copy = new List<Thread>(consumers);
}
foreach (Thread thread in copy)
{
thread.Join();
}
Assuming consumers[0] is only ever null or a Thread instance, you could just do
while (consumers.Count > 0)
{
if (consumers[0] != null)
{
consumers[0].Join();
}
}
In your scenario, when you just want to wait for the threads, why assume that those threads must remove themselves from the list? Make sure that only the main thread removes a thread from the list, after that thread has terminated. It won't change your design, and you can be sure you won't access an element that isn't there. Your loop would look like this:
for (int i = 0; i < consumers.Count; ++i)
consumers[i].Join();
//Now that you have joined everyone, just remove reference to your list
consumers = null;
In my current C#/NET 3.5 application, I have a task queue (thread safe) and I have 5 worker threads that has to constantly look for tasks in the queue. If a task is available, any one worker will dequeue the task and take required action.
My worker thread class is as follows:
public class WorkerThread
{
//ConcurrentQueue is my implementation of thread safe queue
//Essentially just a wrapper around Queue<T> with synchronization locks
readonly ConcurrentQueue<CheckPrimeTask> mQ;
readonly Thread mWorker;
bool mStop;
public WorkerThread (ConcurrentQueue<CheckPrimeTask> aQ) {
mQ = aQ;
mWorker = new Thread (Work) {IsBackground = true};
mStop = false;
}
private void Work () {
while (!mStop) {
if (mQ.Count == 0) {
Thread.Sleep (0);
continue;
}
var task = mQ.Dequeue ();
//Someone else might have been lucky in stealing
//the task by the time we dequeued it!!
if (task == null)
continue;
task.IsPrime = IsPrime (task.Number);
task.ExecutedBy = Thread.CurrentThread.ManagedThreadId;
//Ask the threadpool to execute the task callback to
//notify completion
ThreadPool.QueueUserWorkItem (task.CallBack, task);
}
}
private bool IsPrime (int number) {
int limit = Convert.ToInt32 (Math.Sqrt (number));
for (int i = 2; i <= limit; i++) {
if (number % i == 0)
return false;
}
return true;
}
public void Start () {
mStop = false;
mWorker.Start ();
}
public void Stop () {
mStop = true;
}
}
Problem is that when the queue is empty, it consumes too much CPU (nearly 98%). I tried an AutoResetEvent to notify the workers that the queue has changed, so they effectively wait for that signal to be set. It has brought the CPU down to nearly 0%, but I am not entirely sure whether this is the best method. Can you suggest a better way to keep the threads idle without hurting CPU usage?
Check out this implementation of a BlockingQueue. If the queue is empty, it uses Monitor.Wait() to put the thread to sleep. When an item is added, it uses Monitor.Pulse() to wake up a thread that is sleeping on the empty queue.
Another technique is to use a semaphore. Each time you add an item to a queue, call Release(). When you want an item from a queue, call WaitOne().
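A minimal sketch of that semaphore technique (the `SemaphoreQueue` name is illustrative): the semaphore's count mirrors the number of queued items, so Remove blocks instead of spinning and never runs on an empty queue.

```csharp
using System.Collections.Generic;
using System.Threading;

class SemaphoreQueue<T>
{
    private readonly Queue<T> q = new Queue<T>();
    private readonly Semaphore items = new Semaphore(0, int.MaxValue);

    public void Add(T item)
    {
        lock (q)
        {
            q.Enqueue(item);
        }
        items.Release();    // one Release per enqueued item
    }

    public T Remove()
    {
        items.WaitOne();    // blocks until at least one item has been added
        lock (q)
        {
            return q.Dequeue();
        }
    }
}
```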
You currently have Thread.Sleep(0) in your Work method for the case where there are no queue items. Change it to anything greater than 0 and your CPU use will go down. Try 10 to start with...
You have a couple of options that I can think of.
One way is to place a small thread sleep in your loop. This will basically drop your CPU usage to 0 and is a fairly standard way of doing this.
Another way is to use a reset (either auto or manual) as suggested by Mitch Wheat in the comments.
You could also devise some kind of IdleTask that has a thread sleep for a certain amount of time and if your queue is empty, just process the IdleTask (which will thread sleep).
If your queue is thread safe, then you would not need to do this...
//Someone else might have been lucky in stealing
//the task by the time we dequeued it!!
if (task == null)
continue;