Do I need two queues for producer-consumer pattern? - c#

I have a hard time understanding BlockingCollection.
The following code is from an old answer, but it uses two queues.
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;
namespace Demo
{
class Program
{
static void Main(string[] args)
{
new Program().run();
}
void run()
{
int threadCount = 4;
Task[] workers = new Task[threadCount];
Task.Factory.StartNew(consumer);
for (int i = 0; i < threadCount; ++i)
{
int workerId = i;
Task task = new Task(() => worker(workerId));
workers[i] = task;
task.Start();
}
for (int i = 0; i < 100; ++i)
{
Console.WriteLine("Queueing work item {0}", i);
inputQueue.Add(i);
Thread.Sleep(50);
}
Console.WriteLine("Stopping adding.");
inputQueue.CompleteAdding();
Task.WaitAll(workers);
outputQueue.CompleteAdding();
Console.WriteLine("Done.");
Console.ReadLine();
}
void worker(int workerId)
{
Console.WriteLine("Worker {0} is starting.", workerId);
foreach (var workItem in inputQueue.GetConsumingEnumerable())
{
Console.WriteLine("Worker {0} is processing item {1}", workerId, workItem);
Thread.Sleep(100); // Simulate work.
outputQueue.Add(workItem); // Output completed item.
}
Console.WriteLine("Worker {0} is stopping.", workerId);
}
void consumer()
{
Console.WriteLine("Consumer is starting.");
foreach (var workItem in outputQueue.GetConsumingEnumerable())
{
Console.WriteLine("Consumer is using item {0}", workItem);
Thread.Sleep(25);
}
Console.WriteLine("Consumer is finished.");
}
BlockingCollection<int> inputQueue = new BlockingCollection<int>();
BlockingCollection<int> outputQueue = new BlockingCollection<int>();
}
}
Can I just use one queue? And if I use .NET 4.5 (TPL) to replace it, what should I do?

Producer-consumer is a model that defines two actors:
One who creates items - producer
One who processes those items - consumer
There are different ways to organize their interaction, but the simplest one is a queue: the producer enqueues new items, and the consumer dequeues them.
The code example shown above actually contains three actors and two queues:
Producer 1 (in Main)
for (int i = 0; i < 100; ++i)
{
Console.WriteLine("Queueing work item {0}", i);
inputQueue.Add(i);
Thread.Sleep(50);
}
Consumer 1 and Producer 2 at the same time ( worker method):
foreach (var workItem in inputQueue.GetConsumingEnumerable())
{
Console.WriteLine("Worker {0} is processing item {1}", workerId, workItem);
Thread.Sleep(100); // Simulate work.
outputQueue.Add(workItem); // Output completed item.
}
Consumer 2 (consumer method):
foreach (var workItem in outputQueue.GetConsumingEnumerable())
{
Console.WriteLine("Consumer is using item {0}", workItem);
Thread.Sleep(25);
}
This is not plain producer-consumer - it is a conveyor "pattern" built from two sequentially connected producer-consumer systems.
So, the ANSWER:
No, there is no need to use two queues. You can reduce the example to a proper one-level producer-consumer by removing the worker method and having the consumer method operate on inputQueue directly.
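For illustration, a minimal sketch of that reduction, reusing the types from the example above (the worker tasks now consume inputQueue directly and the second queue disappears):
void run()
{
    int threadCount = 4;
    Task[] consumers = new Task[threadCount];
    for (int i = 0; i < threadCount; ++i)
    {
        int workerId = i; // copy to avoid capturing the loop variable
        consumers[i] = Task.Factory.StartNew(() => consume(workerId));
    }

    for (int i = 0; i < 100; ++i)
        inputQueue.Add(i);          // produce

    inputQueue.CompleteAdding();    // signal that no more items will arrive
    Task.WaitAll(consumers);        // wait until the queue is drained
    Console.WriteLine("Done.");
}

void consume(int workerId)
{
    // Blocks until an item is available; the loop ends after CompleteAdding
    // has been called and the collection is empty.
    foreach (var workItem in inputQueue.GetConsumingEnumerable())
        Console.WriteLine("Consumer {0} is using item {1}", workerId, workItem);
}

BlockingCollection<int> inputQueue = new BlockingCollection<int>();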
ABOUT the old answer:
The old question does not deal with plain producer-consumer - it deals with the conveyor model (data flow). Simple producer-consumer does not need any complex multi-level data transitions.

Related

C# Multithreading with slots

I have this function which checks proxy servers. Currently it runs a fixed number of threads at a time and waits for all of them to finish before starting the next set. Is it possible to start a new thread as soon as one of the maximum allowed finishes?
for (int i = 0; i < listProxies.Count(); i+=nThreadsNum)
{
for (nCurrentThread = 0; nCurrentThread < nThreadsNum; nCurrentThread++)
{
if (nCurrentThread < nThreadsNum)
{
string strProxyIP = listProxies[i + nCurrentThread].sIPAddress;
int nPort = listProxies[i + nCurrentThread].nPort;
tasks.Add(Task.Factory.StartNew<ProxyAddress>(() => CheckProxyServer(strProxyIP, nPort, nCurrentThread)));
}
}
Task.WaitAll(tasks.ToArray());
foreach (var tsk in tasks)
{
ProxyAddress result = tsk.Result;
UpdateProxyDBRecord(result.sIPAddress, result.bOnlineStatus);
}
tasks.Clear();
}
This seems much more simple:
int numberProcessed = 0;
Parallel.ForEach(listProxies,
new ParallelOptions { MaxDegreeOfParallelism = nThreadsNum },
(p)=> {
var result = CheckProxyServer(p.sIPAddress, p.nPort, Thread.CurrentThread.ManagedThreadId);
UpdateProxyDBRecord(result.sIPAddress, result.bOnlineStatus);
Interlocked.Increment(ref numberProcessed);
});
With slots:
var obj = new Object();
var slots = new List<int>();
Parallel.ForEach(listProxies,
new ParallelOptions { MaxDegreeOfParallelism = nThreadsNum },
(p)=> {
int threadId = Thread.CurrentThread.ManagedThreadId;
int slot = slots.IndexOf(threadId);
if (slot == -1)
{
lock(obj)
{
slots.Add(threadId);
}
slot = slots.IndexOf(threadId);
}
var result = CheckProxyServer(p.sIPAddress, p.nPort, slot);
UpdateProxyDBRecord(result.sIPAddress, result.bOnlineStatus);
});
I took a few shortcuts there to guarantee thread safety. You don't have to do the normal check-lock-check dance because there will never be two threads attempting to add the same threadid to the list, so the second check will always fail and isn't needed. Secondly, for the same reason, I don't believe you need to ever lock around the outer IndexOf either. That makes this a very highly efficient concurrent routine that rarely locks (it should only lock nThreadsNum times) no matter how many items are in the enumerable.
Another solution is to use a SemaphoreSlim or the producer-consumer pattern with a BlockingCollection<T>. Both solutions support cancellation.
SemaphoreSlim
private async Task CheckProxyServerAsync(IEnumerable<ProxyInfo> proxies)
{
var tasks = new List<Task>();
int currentThreadNumber = 0;
int maxNumberOfThreads = 8;
using (var semaphore = new SemaphoreSlim(maxNumberOfThreads, maxNumberOfThreads))
{
foreach (var proxy in proxies)
{
// Asynchronously wait until thread is available if thread limit reached
await semaphore.WaitAsync();
string proxyIP = proxy.IPAddress;
int port = proxy.Port;
tasks.Add(Task.Run(() => CheckProxyServer(proxyIP, port, Interlocked.Increment(ref currentThreadNumber)))
.ContinueWith(
(task) =>
{
ProxyAddress result = task.Result;
// Method call must be thread-safe!
UpdateProxyDbRecord(result.IPAddress, result.OnlineStatus);
Interlocked.Decrement(ref currentThreadNumber);
// Allow to start next thread if thread limit was reached
semaphore.Release();
},
TaskContinuationOptions.OnlyOnRanToCompletion));
}
// Asynchronously wait until all tasks are completed
// to prevent premature disposal of semaphore
await Task.WhenAll(tasks);
}
}
Producer-Consumer Pattern
// Uses a fixed set of threads that process all items
private async Task CheckProxyServerAsync(IEnumerable<ProxyInfo> proxies)
{
var pipe = new BlockingCollection<ProxyInfo>();
int maxNumberOfThreads = 8;
var tasks = new List<Task>();
// Create all threads (count == maxNumberOfThreads)
for (int currentThreadNumber = 0; currentThreadNumber < maxNumberOfThreads; currentThreadNumber++)
{
int threadNumber = currentThreadNumber; // copy so the lambda does not capture the loop variable
tasks.Add(
Task.Run(() => ConsumeProxyInfo(pipe, threadNumber)));
}
proxies.ToList().ForEach(pipe.Add);
pipe.CompleteAdding();
await Task.WhenAll(tasks);
}
private void ConsumeProxyInfo(BlockingCollection<ProxyInfo> proxiesPipe, int currentThreadNumber)
{
while (!proxiesPipe.IsCompleted)
{
if (proxiesPipe.TryTake(out ProxyInfo proxy))
{
int port = proxy.Port;
string proxyIP = proxy.IPAddress;
ProxyAddress result = CheckProxyServer(proxyIP, port, currentThreadNumber);
// Method call must be thread-safe!
UpdateProxyDbRecord(result.IPAddress, result.OnlineStatus);
}
}
}
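As a side note, the same consumer can be written without the polling loop by using GetConsumingEnumerable, which blocks until an item is available and completes once CompleteAdding has been called; a sketch:
private void ConsumeProxyInfo(BlockingCollection<ProxyInfo> proxiesPipe, int currentThreadNumber)
{
    // Blocks on an empty collection instead of spinning on TryTake,
    // and finishes when the producer calls CompleteAdding.
    foreach (ProxyInfo proxy in proxiesPipe.GetConsumingEnumerable())
    {
        ProxyAddress result = CheckProxyServer(proxy.IPAddress, proxy.Port, currentThreadNumber);
        // Method call must be thread-safe!
        UpdateProxyDbRecord(result.IPAddress, result.OnlineStatus);
    }
}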
If I'm understanding your question properly, this is actually fairly simple to do with await Task.WhenAny. Basically, you keep a collection of all of the running tasks. Once you reach a certain number of tasks running, you wait for one or more of your tasks to finish, and then you remove the tasks that were completed from your collection and continue to add more tasks.
Here's an example of what I mean:
var tasks = new List<Task>();
for (int i = 0; i < 20; i++)
{
// I want my list of tasks to contain at most 5 tasks at once
if (tasks.Count == 5)
{
// Wait for at least one of the tasks to complete
await Task.WhenAny(tasks.ToArray());
// Remove all of the completed tasks from the list
tasks = tasks.Where(t => !t.IsCompleted).ToList();
}
// Add some task to the list
// Task.Run unwraps the async delegate, so IsCompleted reflects the inner work.
tasks.Add(Task.Run(async () =>
{
await Task.Delay(1000);
}));
}
I suggest changing your approach slightly. Instead of starting and stopping threads, put your proxy server data in a concurrent queue, one item for each proxy server. Then create a fixed number of threads (or async tasks) to work on the queue. This is more likely to provide smooth performance (you aren't starting and stopping threads over and over, which has overhead) and is a lot easier to code, in my opinion.
A simple example:
class ProxyChecker
{
private ConcurrentQueue<ProxyInfo> _masterQueue = new ConcurrentQueue<ProxyInfo>();
public ProxyChecker(IEnumerable<ProxyInfo> listProxies)
{
foreach (var proxy in listProxies)
{
_masterQueue.Enqueue(proxy);
}
}
public async Task RunChecks(int maximumConcurrency)
{
var count = Math.Min(maximumConcurrency, _masterQueue.Count); // never more workers than the concurrency limit or the item count
var tasks = Enumerable.Range(0, count).Select(i => Task.Run(() => WorkerTask())).ToList();
await Task.WhenAll(tasks);
}
private void WorkerTask()
{
ProxyInfo proxyInfo;
while (_masterQueue.TryDequeue(out proxyInfo))
{
DoTheTest(proxyInfo.IP, proxyInfo.Port);
}
}
}

Random Tasks of Task.Factory.StartNew don't start

I need to have 5 tasks completed in parallel with max 2 executed at a time.
So, as soon as some task is finished, the next should be run - up until there are no tasks pending.
I'm using a solution by L.B. which involves using semaphores for synchronization across tasks.
void LaunchTaskPool ()
{
SemaphoreSlim maxThreadSemaphore = new SemaphoreSlim(2); //Max 2 tasks at a time.
for (int i = 0; i < 5; i++) //loop through 5 tasks to be assigned
{
maxThreadSemaphore.Wait(); //Wait for the queue
Console.WriteLine("Assigning work {0} ", i);
Task t = Task.Factory.StartNew(() =>
{
DoWork(i.ToString()); // assign tasks
}, TaskCreationOptions.LongRunning
)
.ContinueWith(
(task) => maxThreadSemaphore.Release() // step out of the queue
);
}
}
void DoWork(string workname)
{
Thread.Sleep(100);
Console.WriteLine("--work {0} starts", workname);
Thread.Sleep(1000);
Console.WriteLine("--work {0} finishes", workname);
}
The problem is that some random tasks would not even start. For example, in one run Work 1 and 3 never started and Work 4 ran twice.
I tried adding Task.WaitAll() as suggested here, but it didn't help.
Thanks in advance for your suggestions!
Constantine.
I recommend using Parallel.For() for this instead; there's no need to reinvent the wheel! You can specify MaxDegreeOfParallelism when using Parallel.For():
For example:
using System;
using System.Threading;
using System.Threading.Tasks;
namespace ConsoleApp4
{
class Program
{
static void Main()
{
Parallel.For(
0, // Inclusive start
5, // Exclusive end
new ParallelOptions{MaxDegreeOfParallelism = 2},
i => DoWork(i.ToString()));
}
static void DoWork(string workname)
{
Thread.Sleep(100);
Console.WriteLine("--work {0} starts", workname);
Thread.Sleep(1000);
Console.WriteLine("--work {0} finishes", workname);
}
}
}
(Actually I just looked, and this is already in one of the other answers in the thread you linked - is there a reason you didn't want to use that solution? If not, I guess we should close this question as a duplicate...)
Anyway to answer your actual question:
You are accessing a "modified closure" in the loop. To fix this, make a copy of the loop variable i before passing it to the task:
SemaphoreSlim maxThreadSemaphore = new SemaphoreSlim(2); //Max 2 tasks at a time.
for (int i = 0; i < 5; i++) //loop through 5 tasks to be assigned
{
maxThreadSemaphore.Wait(); //Wait for the queue
Console.WriteLine("Assigning work {0} ", i);
int copy = i; // <----- Make a copy here.
Task t = Task.Factory.StartNew(() =>
{
DoWork(copy.ToString()); // assign tasks
}, TaskCreationOptions.LongRunning
)
.ContinueWith(
(task) => maxThreadSemaphore.Release() // step out of the queue
);
}
The problem with your solution is that the loop has already moved on (and may be starting the next Task) before the previous Task actually runs, so the lambda can see a changed value of i.
As @Matthew Watson recommended, you should use Parallel.For.
Just out of interest, this would also solve your problem:
static void LaunchTaskPool()
{
SemaphoreSlim maxThreadSemaphore = new SemaphoreSlim(2); //Max 2 tasks at a time.
for (int i = 0; i < 5; i++) //loop through 5 tasks to be assigned
{
maxThreadSemaphore.Wait(); //Wait for the queue
Console.WriteLine("Assigning work {0} ", i);
StartThread(i, maxThreadSemaphore);
}
}
static void StartThread(int i, SemaphoreSlim maxThreadSemaphore)
{
Task.Factory.StartNew(
() => DoWork(i.ToString()),
TaskCreationOptions.None
).ContinueWith((task) => maxThreadSemaphore.Release());
}
static void DoWork(string workname)
{
Thread.Sleep(100);
Console.WriteLine("--work {0} starts", workname);
Thread.Sleep(1000);
Console.WriteLine("--work {0} finishes", workname);
}

I am working on implementing a job queue using TPL. I am trying to restrict the number of jobs running at a time using ThreadPool.SetMaxThreads, with no luck.

To restrict the threads:
int workerThreads, completionPortThreads;
ThreadPool.GetMaxThreads(out workerThreads, out completionPortThreads);
workerThreads = 2;
ThreadPool.SetMaxThreads(workerThreads, completionPortThreads);
To run the job I tried two options.
Option 1:
ThreadPool.QueueUserWorkItem(new WaitCallback(ThreadProc),task);
Option 2:
Task runner = new Task(() => taskProcessor.ImportIntoArt(task),TaskCreationOptions.LongRunning|TaskCreationOptions.PreferFairness);
runner.Start();
I expect this code to pick up the first two jobs for processing while the 3rd one goes into the queue. The first two jobs do start as expected; however, the 3rd one is also picked up for processing.
Any help is highly appreciated.
Use the QueuedTaskScheduler from this package in conjunction with the Task.Factory.StartNew method:
var scheduler = new QueuedTaskScheduler(TaskScheduler.Default, 2);
var jobAction = new Action<string>(
jobName =>
{
Console.WriteLine("I am job " + jobName + " and I start at " + DateTime.Now.ToLongTimeString());
Thread.Sleep(10000);
Console.WriteLine("I am job " + jobName + " and I finish at " + DateTime.Now.ToLongTimeString());
});
var jobs = Enumerable
.Range(1, 6)
.Select(num => Task.Factory.StartNew(
() => jobAction("Job" + num),
CancellationToken.None,
TaskCreationOptions.LongRunning,
scheduler))
.ToList();
Task.WhenAll(jobs).Wait();
I know you want to achieve this using TPL, but as @stuartd commented, you can't do that with the thread pool. You can instead achieve it the traditional way: create the required number of threads, run each of them in an infinite loop, and have them observe a shared queue of tasks.
Refer to the code below if you want to achieve this without using other libraries.
//Declare queue of task.
static Queue<int> taskQueue = new Queue<int>();
static readonly object lockObj = new object();
//Get task to perform.
static int? GetNextTask()
{
lock (lockObj)
{
if (taskQueue.Count > 0)
return taskQueue.Dequeue();
else return null;
}
}
//Add task to queue from different thread.
static void AddTask(int task)
{
lock (lockObj)
{
taskQueue.Enqueue(task);
}
}
static void PerformThreadOperation()
{
//Run infinite for current thread.
while (true)
{
var task = GetNextTask();
//If there is a task then perform some action; otherwise make the thread sleep for some time (an event could be used to resume the thread instead).
if (task.HasValue)
{
Console.WriteLine("Task Initiate => {0}", task.Value);
Thread.Sleep(4000);
Console.WriteLine("Task Complete => {0}", task.Value);
}
else
{
Console.WriteLine("Task not found, thread is going to be sleep for some moment.");
Console.WriteLine("Thread {0} enter in sleep mode.", Thread.CurrentThread.Name);
Thread.Sleep(5000);
}
}
}
//Create the required threads to process tasks in parallel.
static void TestThreadApplication()
{
Thread thread = new Thread(PerformThreadOperation) { Name = "Worker 1" };
Thread thread1 = new Thread(PerformThreadOperation) { Name = "Worker 2" };
thread.Start();
thread1.Start();
}
static void Main(string[] args)
{
for (int i = 0; i < 6; i++)
{
taskQueue.Enqueue(i);
}
TestThreadApplication();
Thread.Sleep(20000);
for (int i = 6; i < 10; i++)
{
taskQueue.Enqueue(i);
}
Console.ReadKey();
}

How to Perform Interlocked.Increment/Decrement Properly in IO-Bound Code?

I have a simple IO-bound .NET 4.0 console application which sends 1 to n requests to a web service, waits for their completion, and then exits. Here is a sample:
static int counter = 0;
static void Main(string[] args)
{
foreach (my Loop)
{
......................
WebClientHelper.PostDataAsync(... =>
{
................................
................................
Interlocked.Decrement(ref counter);
});
Interlocked.Increment(ref counter);
}
while(counter != 0)
{
Thread.Sleep(500);
}
}
Is this a correct implementation?
You can use Tasks. Let TPL manage those things.
Task<T>[] tasks = ...;
//Started the tasks
Task.WaitAll(tasks);
Another way is to use TaskCompletionSource as mentioned here.
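For illustration, a minimal sketch of the TaskCompletionSource approach. The exact signature of WebClientHelper.PostDataAsync is not shown in the question, so the wrapper below assumes it takes the request plus a completion callback; RequestData and requests are hypothetical placeholders.
// Hypothetical wrapper: turns the callback-based PostDataAsync into a Task.
static Task PostDataTaskAsync(RequestData request)
{
    var tcs = new TaskCompletionSource<object>();
    WebClientHelper.PostDataAsync(request, () =>
    {
        // ... handle the response as in the original callback ...
        tcs.SetResult(null); // complete the task instead of decrementing a counter
    });
    return tcs.Task;
}

static void Main(string[] args)
{
    var tasks = new List<Task>();
    foreach (var request in requests) // "requests" stands in for the original loop's data
    {
        tasks.Add(PostDataTaskAsync(request));
    }
    Task.WaitAll(tasks.ToArray()); // replaces the Interlocked counter and Sleep loop
}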
As suggested by Hans, here's your code implemented with CountdownEvent:
static void Main(string[] args)
{
var counter = new CountdownEvent(1);
foreach (my Loop)
{
......................
counter.AddCount(); // register the request before its callback can possibly fire
WebClientHelper.PostDataAsync(... =>
{
................................
................................
counter.Signal();
});
}
counter.Signal(); // release the initial count now that all requests have been queued
counter.Wait();
}

Waiting for all threads to complete, with a timeout

I'm running into a common pattern in the code that I'm writing, where I need to wait for all threads in a group to complete, with a timeout. The timeout is supposed to be the time required for all threads to complete, so simply doing Thread.Join(timeout) for each thread won't work, since the possible timeout is then timeout * numThreads.
Right now I do something like the following:
var threadFinishEvents = new List<EventWaitHandle>();
foreach (DataObject data in dataList)
{
// Create local variables for the thread delegate
var threadFinish = new EventWaitHandle(false, EventResetMode.ManualReset);
threadFinishEvents.Add(threadFinish);
var localData = (DataObject) data.Clone();
var thread = new Thread(
delegate()
{
DoThreadStuff(localData);
threadFinish.Set();
}
);
thread.Start();
}
WaitHandle.WaitAll(threadFinishEvents.ToArray(), timeout);
However, it seems like there should be a simpler idiom for this sort of thing.
I still think using Join is simpler. Record the expected completion time (as Now+timeout), then, in a loop, do
if(!thread.Join(End-now))
throw new NotFinishedInTime();
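Spelled out, that suggestion looks roughly like this, assuming the threads themselves are kept in a collection (TimeoutException is used here in place of the NotFinishedInTime placeholder):
DateTime end = DateTime.Now + timeout; // deadline for the whole group of threads
foreach (Thread thread in threads)
{
    TimeSpan remaining = end - DateTime.Now;
    if (remaining < TimeSpan.Zero || !thread.Join(remaining))
        throw new TimeoutException("Not all threads finished in time.");
}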
With .NET 4.0 I find System.Threading.Tasks a lot easier to work with. Here's a spin-wait loop which works reliably for me. It blocks the main thread until all the tasks complete. There's also Task.WaitAll, but that hasn't always worked for me.
for (int i = 0; i < N; i++)
{
tasks[i] = Task.Factory.StartNew(() =>
{
DoThreadStuff(localData);
});
}
while (tasks.Any(t => !t.IsCompleted)) { } //spin wait
This doesn't answer the question (no timeout), but I've made a very simple extension method to wait for all threads of a collection:
using System.Collections.Generic;
using System.Threading;
namespace Extensions
{
public static class ThreadExtension
{
public static void WaitAll(this IEnumerable<Thread> threads)
{
if(threads!=null)
{
foreach(Thread thread in threads)
{ thread.Join(); }
}
}
}
}
Then you simply call:
List<Thread> threads=new List<Thread>();
//Add your threads to this collection
threads.WaitAll();
Since the question got bumped I will go ahead and post my solution.
using (var finished = new CountdownEvent(1))
{
foreach (DataObject data in dataList)
{
finished.AddCount();
var localData = (DataObject)data.Clone();
var thread = new Thread(
delegate()
{
try
{
DoThreadStuff(localData);
}
finally
{
finished.Signal();
}
}
);
thread.Start();
}
finished.Signal();
finished.Wait(YOUR_TIMEOUT);
}
Off the top of my head, why don't you just Thread.Join(timeout) and remove the time it took to join from the total timeout?
// pseudo-c#:
TimeSpan timeout = timeoutPerThread * threads.Count();
foreach (Thread thread in threads)
{
DateTime start = DateTime.Now;
if (!thread.Join(timeout))
throw new TimeoutException();
timeout -= (DateTime.Now - start);
}
Edit: the code is now less pseudo. I don't understand why you would mod an answer -2 when the answer you modded +4 is exactly the same, only less detailed.
This may not be an option for you, but if you can use the Parallel Extensions for .NET then you could use Tasks instead of raw threads and then use Task.WaitAll() to wait for them to complete.
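A rough sketch of that idea, reusing DoThreadStuff, dataList, and timeout from the question (Task.WaitAll has an overload that takes a timeout and returns false if it elapses first):
var tasks = dataList
    .Select(data =>
    {
        var localData = (DataObject)data.Clone();
        return Task.Factory.StartNew(() => DoThreadStuff(localData));
    })
    .ToArray();

// true if every task finished within the timeout, false otherwise
bool allCompleted = Task.WaitAll(tasks, timeout);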
I read the book C# 4.0: The Complete Reference by Herbert Schildt. The author uses Join to give a solution:
class MyThread
{
public int Count;
public Thread Thrd;
public MyThread(string name)
{
Count = 0;
Thrd = new Thread(this.Run);
Thrd.Name = name;
Thrd.Start();
}
// Entry point of thread.
void Run()
{
Console.WriteLine(Thrd.Name + " starting.");
do
{
Thread.Sleep(500);
Console.WriteLine("In " + Thrd.Name +
", Count is " + Count);
Count++;
} while (Count < 10);
Console.WriteLine(Thrd.Name + " terminating.");
}
}
// Use Join() to wait for threads to end.
class JoinThreads
{
static void Main()
{
Console.WriteLine("Main thread starting.");
// Construct three threads.
MyThread mt1 = new MyThread("Child #1");
MyThread mt2 = new MyThread("Child #2");
MyThread mt3 = new MyThread("Child #3");
mt1.Thrd.Join();
Console.WriteLine("Child #1 joined.");
mt2.Thrd.Join();
Console.WriteLine("Child #2 joined.");
mt3.Thrd.Join();
Console.WriteLine("Child #3 joined.");
Console.WriteLine("Main thread ending.");
Console.ReadKey();
}
}
I was trying to figure out how to do this but I could not get any answers from Google.
I know this is an old thread, but here is my solution:
Use the following class:
class ThreadWaiter
{
private int _numThreads = 0;
private readonly int _spinTime;
private readonly object _lock = new object();
public ThreadWaiter(int spinTime)
{
this._spinTime = spinTime;
}
public void AddThreads(int numThreads)
{
// Locking keeps the counter consistent when threads check in and out concurrently.
lock (_lock)
{
_numThreads += numThreads;
}
}
public void RemoveThread()
{
lock (_lock)
{
if (_numThreads > 0)
{
_numThreads--;
}
}
}
public void Wait()
{
while (true)
{
lock (_lock)
{
if (_numThreads == 0) return;
}
System.Threading.Thread.Sleep(_spinTime);
}
}
}
Call AddThreads(int numThreads) before starting the thread(s).
Call RemoveThread() from each thread after its work has completed.
Call Wait() at the point where you want to block until all the threads have completed before continuing.
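For example, a minimal usage sketch, reusing DoThreadStuff and dataList from the question:
var waiter = new ThreadWaiter(100); // check the counter every 100 ms
waiter.AddThreads(dataList.Count);
foreach (DataObject data in dataList)
{
    var localData = (DataObject)data.Clone();
    new Thread(() =>
    {
        try { DoThreadStuff(localData); }
        finally { waiter.RemoveThread(); } // always check out, even if the work throws
    }).Start();
}
waiter.Wait(); // blocks until every thread has called RemoveThread()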
Possible solution:
var tasks = dataList
.Select(data => Task.Factory.StartNew(() => DoThreadStuff(data), TaskCreationOptions.LongRunning | TaskCreationOptions.PreferFairness))
.ToArray();
var timeout = TimeSpan.FromMinutes(1);
Task.WaitAll(tasks, timeout);
Assuming dataList is the list of items and each item needs to be processed in a separate thread.
Here is an implementation inspired by Martin v. Löwis's answer:
/// <summary>
/// Blocks the calling thread until all threads terminate, or the specified
/// time elapses. Returns true if all threads terminated in time, or false if
/// at least one thread has not terminated after the specified amount of time
/// elapsed.
/// </summary>
public static bool JoinAll(IEnumerable<Thread> threads, TimeSpan timeout)
{
ArgumentNullException.ThrowIfNull(threads);
if (timeout < TimeSpan.Zero)
throw new ArgumentOutOfRangeException(nameof(timeout));
Stopwatch stopwatch = Stopwatch.StartNew();
foreach (Thread thread in threads)
{
if (!thread.IsAlive) continue;
TimeSpan remaining = timeout - stopwatch.Elapsed;
if (remaining < TimeSpan.Zero) return false;
if (!thread.Join(remaining)) return false;
}
return true;
}
For measuring the remaining time, it uses a Stopwatch instead of DateTime.Now. The Stopwatch is not sensitive to system-wide clock adjustments.
Usage example:
bool allTerminated = JoinAll(new[] { thread1, thread2 }, TimeSpan.FromSeconds(10));
The timeout must be a positive or zero TimeSpan. The Timeout.InfiniteTimeSpan constant is not supported.
