I want to achieve the requirement below; please suggest a solution.
string[] filenames = Directory.GetFiles(@"C:\Temp"); // 10 files
for (int i = 0; i < filenames.Length; i++)
{
ProcessFile(filenames[i]); //it takes time to execute
}
I want to implement multi-threading. For example, there are 10 files and I want to process 3 files at a time (configurable, say maxthreadcount). So 3 files will be processed on 3 threads from the for loop, and when any thread completes execution it should pick up the next item from the loop. I also want to ensure that all the files are processed before the loop exits.
Please suggest the best approach.
Try
Parallel.For(0, filenames.Length, i => {
ProcessFile(filenames[i]);
});
MSDN
It's only available since .NET 4. Hope that's acceptable.
This will do the job in .NET 2.0:
class Program
{
static int workingCounter = 0;
static int workingLimit = 10;
static int processedCounter = 0;
static void Main(string[] args)
{
string[] files = Directory.GetFiles("C:\\Temp");
int checkCount = files.Length;
foreach (string file in files)
{
//wait for free limit...
while (workingCounter >= workingLimit)
{
Thread.Sleep(100);
}
Interlocked.Increment(ref workingCounter);
ParameterizedThreadStart pts = new ParameterizedThreadStart(ProcessFile);
Thread th = new Thread(pts);
th.Start(file);
}
//wait for all threads to complete...
while (processedCounter< checkCount)
{
Thread.Sleep(100);
}
Console.WriteLine("Work completed!");
}
static void ProcessFile(object file)
{
try
{
Console.WriteLine(DateTime.Now.ToString() + " received: " + file + " thread count is: " + workingCounter.ToString());
//make some sleep for demo...
Thread.Sleep(2000);
}
catch (Exception ex)
{
//handle your exception...
string exMsg = ex.Message;
}
finally
{
Interlocked.Decrement(ref workingCounter);
Interlocked.Increment(ref processedCounter);
}
}
}
Take a look at the Producer/Consumer Queue example by Joe Albahari. It should provide a good starting point for what you're trying to accomplish.
You could use the ThreadPool.
Example:
ThreadPool.SetMaxThreads(3, 3);
for (int i = 0; i < filenames.Length; i++)
{
ThreadPool.QueueUserWorkItem(new WaitCallback(ProcessFile), filenames[i]);
}
static void ProcessFile(object fileNameObj)
{
var fileName = (string)fileNameObj;
// do your processing here.
}
If you are using the ThreadPool elsewhere in your application then this would not be a good solution since it is shared across your app.
You could also grab a different thread pool implementation, for example SmartThreadPool
Rather than starting a thread for each file name, put the file names into a queue and then start up three threads to process them. Or, since the main thread is now free, start up two threads and let the main thread work on it, too:
Queue<string> MyQueue;
void MyProc()
{
string[] filenames = Directory.GetFiles(...);
MyQueue = new Queue<string>(filenames);
// start two threads
Thread t1 = new Thread((ThreadStart)ProcessQueue);
Thread t2 = new Thread((ThreadStart)ProcessQueue);
t1.Start();
t2.Start();
// main thread processes the queue, too!
ProcessQueue();
// wait for threads to complete
t1.Join();
t2.Join();
}
private object queueLock = new object();
void ProcessQueue()
{
while (true)
{
string s;
lock (queueLock)
{
if (MyQueue.Count == 0)
{
// queue is empty
return;
}
s = MyQueue.Dequeue();
}
ProcessFile(s);
}
}
Another option is to use a semaphore to control how many threads are working:
Semaphore mySem = new Semaphore(3, 3);
void MyProc()
{
string[] filenames = Directory.GetFiles(...);
foreach (string s in filenames)
{
mySem.WaitOne();
ThreadPool.QueueUserWorkItem(ProcessFile, s);
}
// wait for all threads to finish
int count = 0;
while (count < 3)
{
mySem.WaitOne();
++count;
}
}
void ProcessFile(object state)
{
string fname = (string)state;
// do whatever
mySem.Release(); // release so another thread can start
}
The first will perform somewhat better because you don't have the overhead of starting and stopping a thread for each file name processed. The second is much shorter and cleaner, though, and takes full advantage of the thread pool. Likely you won't notice the performance difference.
You can set the max threads using ParallelOptions:
Parallel.For Method (Int32, Int32, ParallelOptions, Action<Int32>)
ParallelOptions.MaxDegreeOfParallelism
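For example, a minimal sketch (reusing the ProcessFile method from the question) that limits the loop to 3 concurrent iterations:
var options = new ParallelOptions { MaxDegreeOfParallelism = 3 };
Parallel.For(0, filenames.Length, options, i =>
{
    // At most 3 iterations run at once; the next file starts as soon as a slot frees up.
    ProcessFile(filenames[i]);
});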
var results = filenames.AsParallel().Select(filename => ProcessFile(filename)).ToArray();
bool ProcessFile(object fileNameObj)
{
var fileName = (string)fileNameObj;
// do your processing here.
return true;
}
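If you also need the configurable limit from the question, PLINQ can cap its concurrency; a minimal sketch:
var results = filenames
    .AsParallel()
    .WithDegreeOfParallelism(3) // the question's maxthreadcount
    .Select(filename => ProcessFile(filename))
    .ToArray();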
Related
I have this function which checks proxy servers. Currently it checks a batch of them on a fixed number of threads and waits for all of them to finish before the next batch starts. Is it possible to start a new thread as soon as one of the maximum allowed finishes?
for (int i = 0; i < listProxies.Count(); i+=nThreadsNum)
{
for (nCurrentThread = 0; nCurrentThread < nThreadsNum; nCurrentThread++)
{
if (nCurrentThread < nThreadsNum)
{
string strProxyIP = listProxies[i + nCurrentThread].sIPAddress;
int nPort = listProxies[i + nCurrentThread].nPort;
tasks.Add(Task.Factory.StartNew<ProxyAddress>(() => CheckProxyServer(strProxyIP, nPort, nCurrentThread)));
}
}
Task.WaitAll(tasks.ToArray());
foreach (var tsk in tasks)
{
ProxyAddress result = tsk.Result;
UpdateProxyDBRecord(result.sIPAddress, result.bOnlineStatus);
}
tasks.Clear();
}
This seems much simpler:
int numberProcessed = 0;
Parallel.ForEach(listProxies,
new ParallelOptions { MaxDegreeOfParallelism = nThreadsNum },
(p)=> {
var result = CheckProxyServer(p.sIPAddress, p.nPort, Thread.CurrentThread.ManagedThreadId);
UpdateProxyDBRecord(result.sIPAddress, result.bOnlineStatus);
Interlocked.Increment(ref numberProcessed);
});
With slots:
var obj = new Object();
var slots = new List<int>();
Parallel.ForEach(listProxies,
new ParallelOptions { MaxDegreeOfParallelism = nThreadsNum },
(p)=> {
int threadId = Thread.CurrentThread.ManagedThreadId;
int slot = slots.IndexOf(threadId);
if (slot == -1)
{
lock(obj)
{
slots.Add(threadId);
}
slot = slots.IndexOf(threadId);
}
var result = CheckProxyServer(p.sIPAddress, p.nPort, slot);
UpdateProxyDBRecord(result.sIPAddress, result.bOnlineStatus);
});
I took a few shortcuts there to keep the locking cheap. You don't have to do the normal check-lock-check dance, because no two threads will ever attempt to add the same thread ID to the list, so the second check would always fail and isn't needed. For the same reason the outer IndexOf rarely matters in practice, although strictly speaking reading a List<int> while another thread may be adding to it is not guaranteed to be safe. Either way, this routine locks very rarely (it should only lock nThreadsNum times) no matter how many items are in the enumerable.
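If you would rather sidestep that question entirely, a ConcurrentDictionary keyed by thread ID gives the same slot behaviour without any manual locking. A sketch, using the same hypothetical CheckProxyServer and UpdateProxyDBRecord as above:
var slots = new ConcurrentDictionary<int, int>();
int nextSlot = -1;
Parallel.ForEach(listProxies,
    new ParallelOptions { MaxDegreeOfParallelism = nThreadsNum },
    (p) => {
        // Each distinct worker thread gets a slot number the first time it is seen.
        // Under contention the value factory may run more than once, which can skip
        // a number but never hands out duplicates.
        int slot = slots.GetOrAdd(Thread.CurrentThread.ManagedThreadId,
                                  _ => Interlocked.Increment(ref nextSlot));
        var result = CheckProxyServer(p.sIPAddress, p.nPort, slot);
        UpdateProxyDBRecord(result.sIPAddress, result.bOnlineStatus);
    });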
Another solution is to use a SemaphoreSlim or the Producer-Consumer Pattern using a BlockingCollection<T>. Both solutions support cancellation.
SemaphoreSlim
private async Task CheckProxyServerAsync(IEnumerable<ProxyInfo> proxies)
{
var tasks = new List<Task>();
int currentThreadNumber = 0;
int maxNumberOfThreads = 8;
using (var semaphore = new SemaphoreSlim(maxNumberOfThreads, maxNumberOfThreads))
{
foreach (var proxy in proxies)
{
// Asynchronously wait until thread is available if thread limit reached
await semaphore.WaitAsync();
string proxyIP = proxy.IPAddress;
int port = proxy.Port;
tasks.Add(Task.Run(() => CheckProxyServer(proxyIP, port, Interlocked.Increment(ref currentThreadNumber)))
.ContinueWith(
(task) =>
{
ProxyAddress result = task.Result;
// Method call must be thread-safe!
UpdateProxyDbRecord(result.IPAddress, result.OnlineStatus);
Interlocked.Decrement(ref currentThreadNumber);
// Allow to start next thread if thread limit was reached
semaphore.Release();
},
TaskContinuationOptions.OnlyOnRanToCompletion));
}
// Asynchronously wait until all tasks are completed
// to prevent premature disposal of semaphore
await Task.WhenAll(tasks);
}
}
Producer-Consumer Pattern
// Uses a fixed number of same threads
private async Task CheckProxyServerAsync(IEnumerable<ProxyInfo> proxies)
{
var pipe = new BlockingCollection<ProxyInfo>();
int maxNumberOfThreads = 8;
var tasks = new List<Task>();
// Create all threads (count == maxNumberOfThreads)
for (int i = 0; i < maxNumberOfThreads; i++)
{
int currentThreadNumber = i; // copy so each task captures its own number, not the shared loop variable
tasks.Add(
Task.Run(() => ConsumeProxyInfo(pipe, currentThreadNumber)));
}
proxies.ToList().ForEach(pipe.Add);
pipe.CompleteAdding();
await Task.WhenAll(tasks);
}
private void ConsumeProxyInfo(BlockingCollection<ProxyInfo> proxiesPipe, int currentThreadNumber)
{
while (!proxiesPipe.IsCompleted)
{
if (proxiesPipe.TryTake(out ProxyInfo proxy))
{
int port = proxy.Port;
string proxyIP = proxy.IPAddress;
ProxyAddress result = CheckProxyServer(proxyIP, port, currentThreadNumber);
// Method call must be thread-safe!
UpdateProxyDbRecord(result.IPAddress, result.OnlineStatus);
}
}
}
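As a side note, the consumer loop can also be written with GetConsumingEnumerable, which blocks while the collection is empty instead of spinning on TryTake; a sketch of the same consumer:
private void ConsumeProxyInfo(BlockingCollection<ProxyInfo> proxiesPipe, int currentThreadNumber)
{
    // Blocks until an item arrives, and ends once CompleteAdding has been called
    // and the collection is drained.
    foreach (ProxyInfo proxy in proxiesPipe.GetConsumingEnumerable())
    {
        ProxyAddress result = CheckProxyServer(proxy.IPAddress, proxy.Port, currentThreadNumber);
        // Method call must be thread-safe!
        UpdateProxyDbRecord(result.IPAddress, result.OnlineStatus);
    }
}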
If I'm understanding your question properly, this is actually fairly simple to do with await Task.WhenAny. Basically, you keep a collection of all of the running tasks. Once you reach a certain number of tasks running, you wait for one or more of your tasks to finish, and then you remove the tasks that were completed from your collection and continue to add more tasks.
Here's an example of what I mean below:
var tasks = new List<Task>();
for (int i = 0; i < 20; i++)
{
// I want my list of tasks to contain at most 5 tasks at once
if (tasks.Count == 5)
{
// Wait for at least one of the tasks to complete
await Task.WhenAny(tasks.ToArray());
// Remove all of the completed tasks from the list
tasks = tasks.Where(t => !t.IsCompleted).ToList();
}
// Add some task to the list
tasks.Add(Task.Run(async () =>
{
// Task.Run unwraps the inner task, so IsCompleted reflects the real work;
// Task.Factory.StartNew with an async delegate would give you a Task<Task> instead.
await Task.Delay(1000);
}));
}
I suggest changing your approach slightly. Instead of starting and stopping threads, put your proxy server data in a concurrent queue, one item for each proxy server. Then create a fixed number of threads (or async tasks) to work on the queue. This is more likely to provide smooth performance (you aren't starting and stopping threads over and over, which has overhead) and is a lot easier to code, in my opinion.
A simple example:
class ProxyChecker
{
private ConcurrentQueue<ProxyInfo> _masterQueue = new ConcurrentQueue<ProxyInfo>();
public ProxyChecker(IEnumerable<ProxyInfo> listProxies)
{
foreach (var proxy in listProxies)
{
_masterQueue.Enqueue(proxy);
}
}
public async Task RunChecks(int maximumConcurrency)
{
var count = Math.Min(maximumConcurrency, _masterQueue.Count);
var tasks = Enumerable.Range(0, count).Select(i => Task.Run(() => WorkerTask())).ToList();
await Task.WhenAll(tasks);
}
private void WorkerTask()
{
ProxyInfo proxyInfo;
while (_masterQueue.TryDequeue(out proxyInfo))
{
DoTheTest(proxyInfo.IP, proxyInfo.Port);
}
}
}
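Usage might look something like this, from an async method:
// Check all proxies with at most 8 concurrent workers.
var checker = new ProxyChecker(listProxies);
await checker.RunChecks(8);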
I have some code that runs thousands of URLs through a third party library. Occasionally the method in the library hangs which takes up a thread. After a while all threads are taken up by processes doing nothing and it grinds to a halt.
I am using a SemaphoreSlim to control adding new threads so I can have an optimal number of tasks running. I need a way to identify tasks that have been running too long and then to kill them but also release a thread from the SemaphoreSlim so a new task can be created.
I am struggling with the approach here, so I made some test code that imitates what I am doing. It creates tasks that have a 10% chance of hanging, so very quickly all threads have hung.
How should I be checking for these and killing them off?
Here is the code:
class Program
{
public static SemaphoreSlim semaphore;
public static List<Task> taskList;
static void Main(string[] args)
{
List<string> urlList = new List<string>();
Console.WriteLine("Generating list");
for (int i = 0; i < 1000; i++)
{
//adding random strings to simulate a large list of URLs to process
urlList.Add(Path.GetRandomFileName());
}
Console.WriteLine("Queueing tasks");
semaphore = new SemaphoreSlim(10, 10);
Task.Run(() => QueueTasks(urlList));
Console.ReadLine();
}
static void QueueTasks(List<string> urlList)
{
taskList = new List<Task>();
foreach (var url in urlList)
{
Console.WriteLine("{0} tasks can enter the semaphore.",
semaphore.CurrentCount);
semaphore.Wait();
taskList.Add(DoTheThing(url));
}
}
static async Task DoTheThing(string url)
{
Random rand = new Random();
// simulate the IO process
await Task.Delay(rand.Next(2000, 10000));
// add a 10% chance that the thread will hang simulating what happens occasionally with http request
int chance = rand.Next(1, 100);
if (chance <= 10)
{
while (true)
{
await Task.Delay(1000000);
}
}
semaphore.Release();
Console.WriteLine(url);
}
}
As people have already pointed out, aborting threads is bad in general and there is no guaranteed way of doing it in C#. Using a separate process to do the work and then killing it is a slightly better idea than attempting Thread.Abort, but still not the best way to go. Ideally, you want co-operative threads/processes, which use IPC to decide when to bail out themselves. This way the cleanup is done properly.
With all that said, you can use code like below to do what you intend to do. I have written it assuming your task will be done in a thread. With slight changes, you can use the same logic to do your task in a process
The code is by no means bullet-proof and is meant to be illustrative. The concurrent code is not well tested, locks are held for longer than needed, and in some places I am not locking at all (like the Log function).
class TaskInfo {
public Thread Task;
public DateTime StartTime;
public TaskInfo(ParameterizedThreadStart startInfo, object startArg) {
Task = new Thread(startInfo);
Task.Start(startArg);
StartTime = DateTime.Now;
}
}
class Program {
const int MAX_THREADS = 1;
const int TASK_TIMEOUT = 6; // in seconds
const int CLEANUP_INTERVAL = TASK_TIMEOUT; // in seconds
public static SemaphoreSlim semaphore;
public static List<TaskInfo> TaskList;
public static object TaskListLock = new object();
public static Timer CleanupTimer;
static void Main(string[] args) {
List<string> urlList = new List<string>();
Log("Generating list");
for (int i = 0; i < 2; i++) {
//adding random strings to simulate a large list of URLs to process
urlList.Add(Path.GetRandomFileName());
}
Log("Queueing tasks");
semaphore = new SemaphoreSlim(MAX_THREADS, MAX_THREADS);
Task.Run(() => QueueTasks(urlList));
CleanupTimer = new Timer(CleanupTasks, null, CLEANUP_INTERVAL * 1000, CLEANUP_INTERVAL * 1000);
Console.ReadLine();
}
// TODO: Guard against re-entrancy
static void CleanupTasks(object state) {
Log("CleanupTasks started");
lock (TaskListLock) {
var now = DateTime.Now;
int n = TaskList.Count;
for (int i = n - 1; i >= 0; --i) {
var task = TaskList[i];
Log($"Checking task with ID {task.Task.ManagedThreadId}");
// kill processes running for longer than anticipated
if (task.Task.IsAlive && now.Subtract(task.StartTime).TotalSeconds >= TASK_TIMEOUT) {
Log("Cleaning up hung task");
task.Task.Abort();
}
// remove task if it is not alive
if (!task.Task.IsAlive) {
Log("Removing dead task from list");
TaskList.RemoveAt(i);
continue;
}
}
if (TaskList.Count == 0) {
Log("Disposing cleanup thread");
CleanupTimer.Dispose();
}
}
Log("CleanupTasks done");
}
static void QueueTasks(List<string> urlList) {
TaskList = new List<TaskInfo>();
foreach (var url in urlList) {
Log($"Trying to schedule url = {url}");
semaphore.Wait();
Log("Semaphore acquired");
ParameterizedThreadStart taskRoutine = obj => {
try {
DoTheThing((string)obj);
} finally {
Log("Releasing semaphore");
semaphore.Release();
}
};
var task = new TaskInfo(taskRoutine, url);
lock (TaskListLock)
TaskList.Add(task);
}
Log("All tasks queued");
}
// simulate all processes get hung
static void DoTheThing(string url) {
while (true)
Thread.Sleep(5000);
}
static void Log(string msg) {
Console.WriteLine("{0:HH:mm:ss.fff} Thread {1,2} {2}", DateTime.Now, Thread.CurrentThread.ManagedThreadId.ToString(), msg);
}
}
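As a side note on the co-operative approach mentioned at the top of this answer, a minimal sketch with CancellationToken and a per-item timeout might look like the code below. CheckUrl is a hypothetical stand-in for the real third-party call, and it would itself have to observe the token for this to work, which is exactly what a hanging library won't do; a call that ignores the token can only be abandoned, not stopped.
static async Task ProcessWithTimeoutAsync(string url, TimeSpan timeout, SemaphoreSlim semaphore)
{
    await semaphore.WaitAsync();
    try
    {
        using (var cts = new CancellationTokenSource(timeout))
        {
            // The token is cancelled automatically after `timeout`.
            await Task.Run(() => CheckUrl(url, cts.Token), cts.Token);
        }
    }
    catch (OperationCanceledException)
    {
        Console.WriteLine($"{url} timed out");
    }
    finally
    {
        semaphore.Release();
    }
}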
I'm working on my university project. One of main requirement is to use multithreading (user can choose threads numbers).
I'm new in C# and based on internet research. I choose ThreadPool.
I spent a lot of time observing how the threads act using Parallel Watch in VS and I have no idea how this thing works. For example, threadNumber = 10, but Parallel Watch shows only 4 active threads.
Here is my code:
public void calculateBeta()
{
var finished = new CountdownEvent(1);
for (int i = 0; i < threadNumber; i++)
{
finished.AddCount();
ThreadPool.QueueUserWorkItem(
(state) =>
{
try
{
doSth();
}
finally
{
finished.Signal();
}
});
}
finished.Signal();
finished.Wait();
}
What am I doing wrong? I tried to test this code with many different thread counts and it didn't work as I expected.
EDIT:
private void myTask(object index)
{
int z = (int)index;
double[] result = countBeta(createTableB(z), createTableDiagonalA(z));
int counter = 0;
if ((rest != 0) && (z == threadNumber - 1))
{
for (int j = z * numbersInRow; j < (z + 1) * numbersInRow + rest; j++)
{
N[j] = result[counter];
counter++;
}
}
else
{
for (int j = z * numbersInRow; j < (z + 1) * numbersInRow; j++)
{
N[j] = result[counter];
counter++;
}
}
threads[z] = true;
}
public void calculateBeta()
{
N = new double[num];
setThreadNumber(2);
checkThreadNumber();
setNumberInRow();
setRest();
threads = new bool[threadNumber];
for (int i = 0; i < threadNumber; i++)
{
Thread thread = new Thread(this.myTask);
thread.IsBackground = true;
thread.Start(i);
}
while (!checkThreads())
{
}
}
private bool checkThreads()
{
bool result = true;
for (int i = 0; i < threads.Length; i++)
{
if (!threads[i])
result = false;
}
return result;
}
static void Main(string[] args)
{
Jacobi jacobi = new Jacobi();
Console.WriteLine("Metoda Jacobiego");
Console.WriteLine("Rozwiazywanie ukladu n-rownan z n-niewiadomymi Ax=b");
jacobi.getNum();
jacobi.getA();
jacobi.getB();
jacobi.calculateBeta();
jacobi.calculateM();
jacobi.calculateX();
jacobi.countNorms();
Console.ReadLine();
}
I need the results from calculateBeta for further calculations. Sometimes the threads are not finished yet but the program moves forward without the data they are supposed to provide. I'm using a bool array now, but this is not an elegant way to deal with it (creating a bool table and checking whether all threads are finished). How can I manage this in a different way?
This is because you're using ThreadPool to manage your threads. It will create a certain number of threads based on many factors. You can tweak some of the settings, but by and large when you commit to using ThreadPool to manage your threads you commit to a black box. Check out GetMaxThreads and GetMinThreads and their setter counterparts for some of your options.
Check out this ThreadPool Architecture article on MSDN. It gives good background to the hows and whys of the class. But in the introductory paragraph you will see this sentence, which is key to your conundrum:
The thread pool is primarily used to reduce the number of application
threads and provide management of the worker threads.
If you want to have the kind of control where you launch 10 threads in quick succession you should avoid ThreadPool and just manage the threads yourself. Here is a simple, absolutely minimal example of launching ten threads and also passing different data to each, in this case an index:
void ButtonClickHandlerOrSomeOtherMethod()
{
for (int i=1; i<=10; i++) // using a 1-based index
{
new Thread(ThreadTask).Start(i);
}
}
void ThreadTask(object i)
{
Console.WriteLine("Thread " + i + " ID: " + Thread.CurrentThread.ManagedThreadId);
}
And some sample output:
Thread 1 ID: 19
Thread 2 ID: 34
Thread 3 ID: 26
Thread 4 ID: 5
Thread 5 ID: 36
Thread 6 ID: 18
Thread 7 ID: 9
Thread 8 ID: 38
Thread 9 ID: 39
Thread 10 ID: 40
Follow-up code demonstrating synching with threads and "waiting" until they are all finished:
void ButtonClickHandlerOrSomeOtherMethod()
{
// need a collection of threads to call Join after Start(s)
var threads = new List<Thread>();
// create threads, add to List and start them
for (int i=1; i<=10; i++) {
var thread = new Thread(ThreadTask);
threads.Add(thread);
// a background thread will allow main app to exit even
// if the thread is still running
thread.IsBackground = true;
thread.Start(i);
}
// call Join on each thread which makes this thread wait on
// all 10 other threads
foreach (var thread in threads)
thread.Join();
// this message will not show until all threads are finished
Console.WriteLine("All threads finished.");
}
void ThreadTask(object i)
{
Console.WriteLine("Thread " + i + " ID: " + Thread.CurrentThread.ManagedThreadId);
// introducing some randomness to how long a task "works on something"
Thread.Sleep(100 * new Random().Next(0, 10));
Console.WriteLine("Thread " + i + " finished.");
}
The whole design of the thread pool is that it doesn't have to create a new actual thread every time a new item is queued up. If the pool notices that it has items pending in the queue for an extended period of time it will eventually start spinning up new threads, over time. If you're continually saturating the thread pool with operations, you'll see the number of actual threads rise. It will also only add new threads up to a limit; based on what it feels is going to have the best throughput. For example, it will avoid creating a lot more threads than cores assuming all of the threads are actively running CPU bound work.
The idea of using the thread pool is if you don't care how many actual threads there are, but rather just want to have efficient throughput of the operations that you have, allowing the framework lots of freedom on how to best optimize that work. If you have very specific requirements as to how many threads you have, you'll need to create threads manually rather than using a pool.
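If you do want to tweak the pool rather than bypass it, the GetMinThreads/SetMinThreads knobs mentioned in the earlier answer look roughly like this. This is only a sketch: raising the minimum affects how quickly the pool injects new threads, it is not a hard guarantee of concurrency.
// Query the pool's current limits.
ThreadPool.GetMinThreads(out int minWorker, out int minIo);
ThreadPool.GetMaxThreads(out int maxWorker, out int maxIo);
Console.WriteLine($"Min: {minWorker}/{minIo}, Max: {maxWorker}/{maxIo}");

// Ask the pool to keep at least 10 worker threads available before it
// falls back to its slow thread-injection heuristic.
ThreadPool.SetMinThreads(10, minIo);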
// Array of threads launched.
// This array is useful to trace threads status.
Thread[] threads;
private void myTask(object index)
{
Console.Write("myTask {0} started\n", index);
Console.Write("myTask {0} finisced\n", index);
}
public void calculateBeta(UInt16 threadNumber)
{
// Allocate a new array with size of requested number of threads
threads = new Thread[threadNumber];
// For each thread
for (int i = 0; i < threadNumber; i++)
{
// Thread creation
threads[i] = new Thread(this.myTask);
// IsBackground set to true ensures that the application can be killed without waiting for all threads to terminate.
// This is useful in debug to be sure that an error in a task doesn't freeze the app.
// Leave it false in release.
#if DEBUG
threads[i].IsBackground = true;
#endif
// Start the thread
threads[i].Start(i);
}
// Waits until all threads complete.
while (!checkThreads());
}
private bool checkThreads()
{
bool result = true;
for (int i = 0; i < threads.Length; i++)
{
// If the thread wasn't disposed
if (threads[i] != null)
{
// Check if the thread is alive (meaning it is still working)
if (threads[i].IsAlive == true)
{
result = false;
}
else // The thread is not working
{
// Wait for the thread to fully terminate
threads[i].Join();
// Set the reference to null to mark this slot as finished
threads[i] = null;
}
}
}
return result;
}
private void Button_Click(object sender, RoutedEventArgs e)
{
Console.Write("Starting tasks!!\n");
calculateBeta(10);
Console.Write("All tasks finished!!\n");
}
I create my threads like this:
for (int i = 0; i < threadCount; i++)
{
Searcher src = new Searcher(i, this);
threads[i] = new Thread(new ThreadStart(src.getIpRange));
threads[i].Name = string.Format(i.ToString());
}
foreach (Thread t in threads)
{
t.Start();
}
with threadCount (= 100, 150, 255, etc...), but I can't find out how many threads are working at execution time.
I also want to know when all threads have finished their job, and to get a message like "All threads are dead, jobs completed..."
like BackgroundWorker's RunWorkerCompleted event.
Determining when all the threads are finished is simple.
for (int i = 0; i < threadCount; i++)
{
threads[i].Join();
}
Console.WriteLine("All threads are done!");
Can you elaborate on your other requirements?
You can check the ThreadState property of the Thread.
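For example, a quick way to count how many are still running at any moment (a sketch against the threads array from the question, using System.Linq):
// Counts threads that have been started and have not yet finished.
int stillWorking = threads.Count(t => t.IsAlive);
Console.WriteLine("{0} threads still working", stillWorking);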
Might be better to use async methods. This gives you a WaitHandle object, and you can use WaitHandle.WaitAll to wait for all of your async methods to finish.
Here's an intro to asynchronous programming:
http://msdn.microsoft.com/en-us/library/aa719598%28v=VS.71%29.aspx
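If you go that route, the shape is roughly: give each worker a ManualResetEvent, set it when the work is done, and wait on all of them. Note that WaitHandle.WaitAll is limited to 64 handles, so this fits smaller batches better than 100+ threads. A sketch using the Searcher type from the question:
var doneEvents = new ManualResetEvent[threadCount];
for (int i = 0; i < threadCount; i++)
{
    doneEvents[i] = new ManualResetEvent(false);
    var done = doneEvents[i];
    Searcher src = new Searcher(i, this);
    new Thread(() =>
    {
        try { src.getIpRange(); }
        finally { done.Set(); } // signal completion even if the search throws
    }).Start();
}
WaitHandle.WaitAll(doneEvents);
Console.WriteLine("All threads are dead, jobs completed...");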
You definitely want to use the Task class for this or a higher-level concept like Parallel.ForEach. Using the Thread class directly is quite painful.
I recently wrote a blog post comparing various asynchronous approaches, listed in order from best (Task) to worst (Thread).
Here's an example using Task, demonstrating what you wanted to do:
// Start all tasks
var threads = new Task[threadCount];
for (int i = 0; i < threadCount; i++)
{
Searcher src = new Searcher(i, this);
threads[i] = Task.Factory.StartNew(src.getIpRange);
}
// How many are running right now?
var runningCount = threads.Count(x => x.Status == TaskStatus.Running);
// Register a callback when they have all completed (this does not block)
Task.Factory.ContinueWhenAll(threads, MyCallback);
Add a delegate to Searcher and pass it a callback method from your main thread that each thread will call when it finishes. As you launch each thread, add it to a Dictionary keyed by the thread's ManagedThreadId. When each thread finishes, the callback removes the thread from the Dictionary and checks to see if the count is zero.
Dictionary<int, Thread> activeThreads = new Dictionary<int, Thread>();
for (int i = 0; i < threadCount; i++)
{
Searcher src = new Searcher(i, this);
src.Done = new SearcherDoneDelegate(ThreadDone);
threads[i] = new Thread(new ThreadStart(src.getIpRange));
threads[i].Name = string.Format(i.ToString());
}
foreach (Thread t in threads)
{
lock (activeThreads)
{
activeThreads.Add(t.ManagedThreadId, t);
}
t.Start();
}
}
public void ThreadDone(int threadIdArg)
{
lock (activeThreads)
{
activeThreads.Remove(threadIdArg);
if (activeThreads.Count == 0)
{
// all done
}
}
}
public delegate void SearcherDoneDelegate(int threadIdArg);
public static object locker = new object();
public class Searcher
{
public SearcherDoneDelegate Done { get; set; }
public void getIpRange()
{
Done(Thread.CurrentThread.ManagedThreadId);
}
}
If you have more threads than you want to run at one time, put them into a Queue and peel them off as older threads finish (use the callback).
First, I have to point out that creating 100, 150, 255, etc. threads is probably not a good idea. You might be better off using the ThreadPool or Task class (if using .NET 4.0). Aside from that there are two well established methods for waiting until all threads complete.
Join the thread.
Thread.Join blocks until the target thread finishes.
for (int i = 0; i < threadCount; i++)
{
Searcher src = new Searcher(i, this);
threads[i] = new Thread(new ThreadStart(src.getIpRange));
threads[i].Name = string.Format(i.ToString());
}
foreach (Thread t in threads)
{
t.Start();
}
foreach (Thread t in threads)
{
t.Join();
}
Use a CountdownEvent.
A CountdownEvent waits until its internal count reaches zero. This method is better suited if you want to use the ThreadPool. If you are not using .NET 4.0 you can get a really simple implementation over at Joe Albahari's website.
var finished = new CountdownEvent(1);
for (int i = 0; i < threadCount; i++)
{
finished.AddCount();
Searcher src = new Searcher(i, this);
threads[i] = new Thread(
() =>
{
try
{
src.getIpRange();
}
finally
{
finished.Signal();
}
});
threads[i].Name = string.Format(i.ToString());
}
foreach (Thread t in threads)
{
t.Start();
}
finished.Signal();
finished.Wait();
Why not use a single variable, protected by a critical section, to control the number of active threads? The thread function can modify this variable (after entering the critical section, of course).
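As a rough sketch of that idea (using Interlocked on the shared counter instead of an explicit critical section), each worker decrements the counter when it finishes and the last one out signals the main thread:
int activeThreads = threadCount;
var allDone = new ManualResetEvent(false);

for (int i = 0; i < threadCount; i++)
{
    new Thread(() =>
    {
        try
        {
            // ... do the actual work here ...
        }
        finally
        {
            // The last thread to finish wakes up the waiter.
            if (Interlocked.Decrement(ref activeThreads) == 0)
                allDone.Set();
        }
    }).Start();
}

allDone.WaitOne();
Console.WriteLine("All threads are dead, jobs completed...");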
I'm running into a common pattern in the code that I'm writing, where I need to wait for all threads in a group to complete, with a timeout. The timeout is supposed to be the time required for all threads to complete, so simply doing Thread.Join(timeout) for each thread won't work, since the possible timeout is then timeout * numThreads.
Right now I do something like the following:
var threadFinishEvents = new List<EventWaitHandle>();
foreach (DataObject data in dataList)
{
// Create local variables for the thread delegate
var threadFinish = new EventWaitHandle(false, EventResetMode.ManualReset);
threadFinishEvents.Add(threadFinish);
var localData = (DataObject) data.Clone();
var thread = new Thread(
delegate()
{
DoThreadStuff(localData);
threadFinish.Set();
}
);
thread.Start();
}
WaitHandle.WaitAll(threadFinishEvents.ToArray(), timeout);
However, it seems like there should be a simpler idiom for this sort of thing.
I still think using Join is simpler. Record the expected completion time (as end = DateTime.Now + timeout), then, in a loop, do:
if (!thread.Join(end - DateTime.Now))
throw new NotFinishedInTime();
With .NET 4.0 I find System.Threading.Tasks a lot easier to work with. Here's a spin-wait loop which works reliably for me. It blocks the main thread until all the tasks complete. There's also Task.WaitAll, but that hasn't always worked for me.
var tasks = new Task[N];
for (int i = 0; i < N; i++)
{
tasks[i] = Task.Factory.StartNew(() =>
{
DoThreadStuff(localData);
});
}
while (tasks.Any(t => !t.IsCompleted)) { } //spin wait
This doesn't answer the question (no timeout), but I've made a very simple extension method to wait for all threads in a collection:
using System.Collections.Generic;
using System.Threading;
namespace Extensions
{
public static class ThreadExtension
{
public static void WaitAll(this IEnumerable<Thread> threads)
{
if(threads!=null)
{
foreach(Thread thread in threads)
{ thread.Join(); }
}
}
}
}
Then you simply call:
List<Thread> threads=new List<Thread>();
//Add your threads to this collection
threads.WaitAll();
Since the question got bumped I will go ahead and post my solution.
using (var finished = new CountdownEvent(1))
{
foreach (DataObject data in dataList)
{
finished.AddCount();
var localData = (DataObject)data.Clone();
var thread = new Thread(
delegate()
{
try
{
DoThreadStuff(localData);
}
finally
{
finished.Signal();
}
}
);
thread.Start();
}
finished.Signal();
finished.Wait(YOUR_TIMEOUT);
}
Off the top of my head, why don't you just Thread.Join(timeout) and remove the time it took to join from the total timeout?
// pseudo-c#:
TimeSpan timeout = timeoutPerThread * threads.Count();
foreach (Thread thread in threads)
{
DateTime start = DateTime.Now;
if (!thread.Join(timeout))
throw new TimeoutException();
timeout -= (DateTime.Now - start);
}
Edit: the code is now less pseudo. I don't understand why you would mod an answer -2 when the answer you modded +4 is exactly the same, only less detailed.
This may not be an option for you, but if you can use the Parallel Extension for .NET then you could use Tasks instead of raw threads and then use Task.WaitAll() to wait for them to complete.
I read the book C# 4.0: The Complete Reference by Herbert Schildt. The author uses Join to give a solution:
class MyThread
{
public int Count;
public Thread Thrd;
public MyThread(string name)
{
Count = 0;
Thrd = new Thread(this.Run);
Thrd.Name = name;
Thrd.Start();
}
// Entry point of thread.
void Run()
{
Console.WriteLine(Thrd.Name + " starting.");
do
{
Thread.Sleep(500);
Console.WriteLine("In " + Thrd.Name +
", Count is " + Count);
Count++;
} while (Count < 10);
Console.WriteLine(Thrd.Name + " terminating.");
}
}
// Use Join() to wait for threads to end.
class JoinThreads
{
static void Main()
{
Console.WriteLine("Main thread starting.");
// Construct three threads.
MyThread mt1 = new MyThread("Child #1");
MyThread mt2 = new MyThread("Child #2");
MyThread mt3 = new MyThread("Child #3");
mt1.Thrd.Join();
Console.WriteLine("Child #1 joined.");
mt2.Thrd.Join();
Console.WriteLine("Child #2 joined.");
mt3.Thrd.Join();
Console.WriteLine("Child #3 joined.");
Console.WriteLine("Main thread ending.");
Console.ReadKey();
}
}
I was trying to figure out how to do this but I could not get any answers from Google.
I know this is an old thread but here was my solution:
Use the following class:
class ThreadWaiter
{
private int _numThreads = 0;
private int _spinTime;
public ThreadWaiter(int SpinTime)
{
this._spinTime = SpinTime;
}
public void AddThreads(int numThreads)
{
// Interlocked so concurrent callers don't lose updates.
Interlocked.Add(ref _numThreads, numThreads);
}
public void RemoveThread()
{
// Assumes each registered thread calls this exactly once when it finishes.
Interlocked.Decrement(ref _numThreads);
}
public void Wait()
{
// Poll until every registered thread has checked out.
while (Volatile.Read(ref _numThreads) != 0)
{
System.Threading.Thread.Sleep(_spinTime);
}
}
}
Call AddThreads(int numThreads) before starting the thread(s).
Call RemoveThread() from each one as it completes.
Call Wait() at the point where you want to block until all the threads have completed
before continuing.
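A usage sketch (DoWork is a placeholder for the actual work):
// Wait for three worker threads, polling every 100 ms.
var waiter = new ThreadWaiter(100);
waiter.AddThreads(3);
for (int i = 0; i < 3; i++)
{
    new Thread(() =>
    {
        try { DoWork(); }
        finally { waiter.RemoveThread(); } // always check out, even on failure
    }).Start();
}
waiter.Wait(); // blocks until all three have called RemoveThread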
Possible solution:
var tasks = dataList
.Select(data => Task.Factory.StartNew(() => DoThreadStuff(data), TaskCreationOptions.LongRunning | TaskCreationOptions.PreferFairness))
.ToArray();
var timeout = TimeSpan.FromMinutes(1);
Task.WaitAll(tasks, timeout);
Assuming dataList is the list of items and each item needs to be processed in a separate thread.
Here is an implementation inspired by Martin v. Löwis's answer:
/// <summary>
/// Blocks the calling thread until all threads terminate, or the specified
/// time elapses. Returns true if all threads terminated in time, or false if
/// at least one thread has not terminated after the specified amount of time
/// elapsed.
/// </summary>
public static bool JoinAll(IEnumerable<Thread> threads, TimeSpan timeout)
{
ArgumentNullException.ThrowIfNull(threads);
if (timeout < TimeSpan.Zero)
throw new ArgumentOutOfRangeException(nameof(timeout));
Stopwatch stopwatch = Stopwatch.StartNew();
foreach (Thread thread in threads)
{
if (!thread.IsAlive) continue;
TimeSpan remaining = timeout - stopwatch.Elapsed;
if (remaining < TimeSpan.Zero) return false;
if (!thread.Join(remaining)) return false;
}
return true;
}
For measuring the remaining time it uses a Stopwatch instead of DateTime.Now. The Stopwatch component is not sensitive to system-wide clock adjustments.
Usage example:
bool allTerminated = JoinAll(new[] { thread1, thread2 }, TimeSpan.FromSeconds(10));
The timeout must be a positive or zero TimeSpan. The Timeout.InfiniteTimeSpan constant is not supported.