C# (.NET 3.5): run threads together

How can I sync threads so they run together?
For example:
code
Code section A
code
I want every 5 threads to enter Section A together.

Here's some sample code which shows how to use the Barrier class to wait for 5 threads to all be at the same point in the code before being allowed to carry on.
To try it out, run it and then ^C to stop it after a while, and inspect the times when the threads pass the barrier. You'll see that it is waiting until 5 threads are all at the barrier, then they are all released at once (whereupon the Barrier waits for the next 5 threads to be ready).
using System;
using System.Diagnostics;
using System.Threading;
using System.Threading.Tasks;

namespace ConsoleApplication1
{
    class Program
    {
        static void Main()
        {
            Barrier barrier = new Barrier(5); // 5 == #participating threads.
            Action[] actions = new Action[10];
            var sw = Stopwatch.StartNew();

            ThreadPool.SetMinThreads(12, 12); // Prevent delay on starting threads.
                                              // Not recommended for non-test code!

            for (int i = 0; i < actions.Length; ++i)
                actions[i] = () => test(barrier, sw);

            Parallel.Invoke(actions);
        }

        static void test(Barrier barrier, Stopwatch sw)
        {
            int id = Thread.CurrentThread.ManagedThreadId;
            Random rng = new Random(id);

            while (true)
            {
                int wait = 5000 + rng.Next(5000);
                Console.WriteLine($"[{sw.ElapsedMilliseconds:000000}] Thread {id} is sleeping for {wait} ms.");
                Thread.Sleep(wait);
                Console.WriteLine($"[{sw.ElapsedMilliseconds:000000}] Thread {id} is waiting at the barrier.");
                barrier.SignalAndWait();
                Console.WriteLine($"[{sw.ElapsedMilliseconds:000000}] Thread {id} passed the barrier.");
                Thread.Sleep(1000); // This is the equivalent of your "Section A".
            }
        }
    }
}
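A small variation worth knowing about (my addition, not part of the original answer): if "Section A" is something that should run exactly once per group of 5 rather than once per thread, Barrier also has a constructor overload that takes a post-phase action, which a single participant runs while the others wait at the barrier. A minimal sketch:

// Sketch: the post-phase action runs once each time all 5 participants have signalled,
// before any of them are released to continue.
Barrier barrier = new Barrier(5, b =>
{
    Console.WriteLine($"Phase {b.CurrentPhaseNumber} complete; running the once-per-group work here.");
});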

Related

Blocking waiting of Task.WaitAll

I'm actually studying async/await and trying to see for myself the benefit of await Task.WhenAll versus Task.WaitAll in CPU-bound operations. Everyone writes that Task.WaitAll provides a blocking wait, while await Task.WhenAll provides a non-blocking wait.
I created an example in which I wanted to replace Task.WaitAll with await Task.WhenAll and see with my own eyes that there was one more free thread. But I see that even Task.WaitAll does not block the thread, and my question is related to this. In the case of Task.WaitAll, I see that another task is executed on the same thread in which Task.WaitAll is executed. But if I use Thread.Sleep or while (true) instead of Task.WaitAll, then the program behaves as expected.
I thought that the Main method would create task MyTask (-1 worker thread), which would create 16 tasks, call them B1-B16 (-15 worker threads, since 1 worker thread is busy with task MyTask and there are 16 worker threads in total); task MyTask would then block in Task.WaitAll and I would see 15 of the 16 tasks running. But I see all 16 tasks running, and one of them runs on the same thread that task MyTask is running on.
Question.
Why does Task.WaitAll not block the thread it runs on in this example, unlike Thread.Sleep or while (true)? Can someone explain step by step how the code of two tasks on thread 4 works when Task.WaitAll is used? Why is the thread running task MyTask also used by task B16?
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

namespace ConsoleApp1
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine($"Main Thread: {Thread.CurrentThread.ManagedThreadId}");

            int ProcessorCount = Environment.ProcessorCount;
            ThreadPool.SetMaxThreads(ProcessorCount, ProcessorCount);

            int Counter = 0;
            List<Task> MyListForTask = new List<Task>();

            void MyMethod()
            {
                lock (MyListForTask)
                {
                    Counter++;
                    Console.WriteLine($"Counter: {Counter} Thread: {Thread.CurrentThread.ManagedThreadId}");
                }

                //Thread.Sleep(int.MaxValue);
                while (true) { };
            }

            Task MyTask = Task.Run(() =>
            {
                Console.WriteLine($"MyTask Thread: {Thread.CurrentThread.ManagedThreadId}\n");

                for (int i = 0; i < ProcessorCount; i++)
                {
                    MyListForTask.Add(Task.Run(MyMethod));
                }

                //Thread.Sleep(int.MaxValue);
                //while (true) { };
                Task.WaitAll(MyListForTask.ToArray());
            });

            MyTask.Wait();
        }
    }
}
The whole point of multithreading / asynchronous programming is to use your CPU resources as effectively as possible when you do not care about the order of operations.
There is no guarantee that Tasks will complete in the order they were started.
Thread.Sleep, as the name implies, blocks the thread outright: it will not pick up another task, and it waits until the required condition is met (x time has passed) before continuing, and only then picks up another task. In short, Thread.Sleep prevents asynchronous behavior from occurring.
Here you can see more intuitively what each one does. The output will not be 1-100 consecutively, but in a random-looking order.
The WhenAll version even prints DoSomethingElse first, since the tasks are still starting/being executed.
The WaitAll version waits for the tasks before printing DoSomethingElse, even though that slows down execution.
Sleep, as mentioned, only adds time. You can put a thread to sleep in an async or 'sync' method, but the only thing it does is add execution time to your program. The only difference is that in the async case, other available threads will pick up the slack (if any are available).
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace Fiddle
{
    public class Program
    {
        public static void Main(string[] args)
        {
            var ladieDo = new DoSomething();

            ladieDo.RunAsyncAndDontAwaitCompletion();
            Console.ReadLine();
            Console.Clear();

            ladieDo.RunAsyncAndAwaitCompletionOfAll();
            Console.ReadLine();
        }

        public class DoSomething
        {
            public void RunAsyncAndDontAwaitCompletion()
            {
                var proces = Process.GetCurrentProcess();
                Console.WriteLine("Threads: " + proces.Threads.Count);

                var ints = Enumerable.Range(1, 100).ToList();

                // Will report back when it's done, but won't wait for everything
                Task.WhenAll(ints.Select(a => Task.Run(() => Console.WriteLine(a))));

                // This line will be executed as soon as a thread opens up,
                // regardless of whether the above tasks have been completed
                Console.WriteLine("DoSomethingElse");
                Console.ReadLine();
            }

            public void RunAsyncAndAwaitCompletionOfAll()
            {
                var proces = Process.GetCurrentProcess();
                Console.WriteLine("Threads: " + proces.Threads.Count);

                var ints = Enumerable.Range(1, 100).ToList();

                // Wait until all these tasks are done
                Task.WaitAll(ints.Select(a => Task.Run(() => Console.WriteLine(a))).ToArray());

                // Only once the above tasks are done (regardless of order), write this
                Console.WriteLine("DoSomethingElse");
            }

            public void EachTaskWillRunSynchronously()
            {
                var proces = Process.GetCurrentProcess();
                Console.WriteLine("Threads: " + proces.Threads.Count);

                var ints = Enumerable.Range(1, 100).ToList();
                foreach (int i in ints)
                {
                    Console.WriteLine(i);
                    // The below line would just add more time between each output
                    //Thread.Sleep(10);
                }
                Console.WriteLine("DoSomethingElse");
            }
        }
    }
}
Output:
DoSomethingElse
1
2
3
4
6
7
8
9
10
11
12
13
14
5
....
89
90
91
92
93
94
88
84
97
98
96
100
99
95
DoSomethingElse

Doesn't ThreadPriority.Highest guarantee finishing before ThreadPriority.Lowest?

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using System.Threading;

namespace ConsoleApplication58
{
    class MultiThreading
    {
        static void Main(string[] args)
        {
            MultiThreading mul = new MultiThreading();

            Thread t1 = new Thread(new ThreadStart(mul.WriteX));
            Thread t2 = new Thread(new ThreadStart(mul.WriteO));

            t1.Priority = ThreadPriority.Lowest;
            t2.Priority = ThreadPriority.Highest;

            t1.Start();
            t2.Start();
        }

        private void WriteX()
        {
            for (int i = 0; i < 300; i++)
            {
                Console.Write("X");
            }
        }

        private void WriteO()
        {
            for (int i = 0; i < 300; i++)
            {
                Console.Write("O");
            }
        }
    }
}
When I execute the above code, I expect the X's to finish printing last because I gave that method the lowest priority, but sometimes I get O's at the end. Doesn't giving the second thread high priority guarantee that it will finish sooner?
There is no practical guarantee about thread scheduling induced by the priority setting.
For example, the high priority thread could block on IO or a page fault. Then, another thread can execute.
This is not a good way to synchronize threads.
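If the O's really must all be printed before any X's start, an explicit synchronization primitive gives that guarantee where priorities do not. A minimal sketch (my addition, reusing the WriteX/WriteO methods from the question's Main):

MultiThreading mul = new MultiThreading();

Thread t1 = new Thread(new ThreadStart(mul.WriteX));
Thread t2 = new Thread(new ThreadStart(mul.WriteO));

// Start the O writer and wait for it to finish before the X writer even starts.
t2.Start();
t2.Join();   // blocks until WriteO has completed

t1.Start();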
Three hundred operations take the CPU only a fraction of a second. I recommend thinking about your problem and application as a statistical experiment.
Repeat the experiment many times, because a handful of runs is not trustworthy: the population (the number of iterations) is not big enough. I ran this application a few times with 3,000,000 iterations and let it write for 45 seconds (while(true) and Thread.Sleep(45000)); the thread with the higher priority finished first.
Your OS, CPU and PC configuration also matter.

How to work with the queue using the task factory

There is a queue, and a function that processes messages from this queue. The function takes a message from the queue, starts a new task to process the next message, waits for data from other sources, and then carries out the calculation.
Here is an example:
using System;
using System.Diagnostics;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

namespace TestTaskFactory
{
    class Program
    {
        static int Data = 50;
        static int ActiveTasksNumber = 0;
        static int MaxActiveTasksNumber = 0;
        static Stopwatch clock = new Stopwatch();
        static object locker = new object();
        static object locker2 = new object();

        static void Main(string[] args)
        {
            clock.Start();
            Task.Factory.StartNew(() => DoWork());

            while (true)
            {
                Thread.Sleep(10000);
            }
        }

        public static void DoWork()
        {
            // imitation of getting a message from some queue
            int message = GetMessageFromQueue();

            lock (locker2)
            {
                ActiveTasksNumber++;
                MaxActiveTasksNumber = Math.Max(MaxActiveTasksNumber, ActiveTasksNumber);
                Console.Write("\r" + message + " ");
            }

            // Run a new task to work with the next message
            Task.Factory.StartNew(() => DoWork());

            // imitation of waiting for some other data
            Thread.Sleep(3000);

            // imitation of calculations on the message
            int tmp = 0;
            for (int i = 0; i < 30000000; i++)
            {
                tmp = Math.Max(message, i);
            }

            lock (locker2)
            {
                ActiveTasksNumber--;
            }
        }

        public static int GetMessageFromQueue()
        {
            lock (locker)
            {
                if (Data == 0)
                {
                    // Queue is empty. All tasks completed except one
                    // that is waiting for new data
                    clock.Stop();
                    Console.WriteLine("\rMax active tasks number = "
                        + MaxActiveTasksNumber
                        + "\tTime = " + clock.ElapsedMilliseconds + "ms");
                    Console.Write("Press key to run next iteration");
                    clock.Reset();
                    Console.ReadKey();
                    Console.Write(" ");

                    // New data received in the queue. Processing repeats
                    clock.Start();
                    ActiveTasksNumber = 0;
                    MaxActiveTasksNumber = 0;
                    Data = 50;
                }

                Data--;
                return Data;
            }
        }
    }
}
My guess is that when the queue is empty, all tasks have completed except the one that is waiting for new data, and when data arrives in the queue the calculations are repeated.
But if you look at the results, the number of simultaneously running tasks increases every time.
Why is this happening?
Test results
Your approach is wrong.
First of all, where is your Queue?
For any jobs you want to queue in a concurrent environment, use the ConcurrentQueue.
The concurrent queue is used in this fashion; it doesn't need to be locked at any time.
// To create your Queue
ConcurrentQueue<string> queue = new ConcurrentQueue<string>();

// To add objects to your Queue
queue.Enqueue("foo");

// To dequeue items from your Queue
string bar;
queue.TryDequeue(out bar);

// To loop a process until your Queue is empty
while (!queue.IsEmpty)
{
    string item;
    if (queue.TryDequeue(out item))
    {
        // process item here
    }
}
Next, look at how you are incrementing and decrementing your counters; there is a far better, thread-safe way of doing it. Again, the data doesn't need to be locked.
// Change your data type from int to long
static long ActiveTasksNumber = 0;
static long MaxActiveTasksNumber = 0;

// To increment the values in a thread-safe fashion:
Interlocked.Increment(ref ActiveTasksNumber);

// To decrement:
Interlocked.Decrement(ref ActiveTasksNumber);
Implement what I've shown you, and it should make your problems disappear.
Edit:
Namespaces
using System.Collections.Concurrent;
using System.Threading;
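One caveat (my addition, not part of the answer above): Interlocked.Increment makes the counter update atomic, but computing MaxActiveTasksNumber from it is still a separate read-modify-write, so the maximum needs its own atomic update. A compare-exchange loop is one way to do that, assuming the same long fields as above:

// Sketch: atomically record the running maximum without a lock.
long current = Interlocked.Increment(ref ActiveTasksNumber);
long observedMax;
do
{
    observedMax = Interlocked.Read(ref MaxActiveTasksNumber);
    if (current <= observedMax)
        break; // an equal or larger value has already been recorded
}
while (Interlocked.CompareExchange(ref MaxActiveTasksNumber, current, observedMax) != observedMax);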
To expand on my comment:
You have, in essence, this:
public static void DoWork()
{
    // imitation of getting a message from some queue
    int message = GetMessageFromQueue();

    // Run new task to work with next message
    Task.Factory.StartNew(() => DoWork());

    // do some work
}
Your code gets the first message, starts a task to work with the next item, and then does its work. While the first task is working, the second gets an item and spawns yet another task to get an item from the queue. So now you have two threads supposedly doing work and a third that's going to spawn yet another, and so on.
Nothing in your code stops it from creating a new task for every item in the queue.
If your queue started with 38 things, it's highly likely that you'll end up with 38 concurrent tasks.
You need to limit the number of tasks you're running at the same time. There are many ways to do that. Perhaps the easiest is a simple producer-consumer model using BlockingCollection.
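As a rough sketch of that producer-consumer model (my addition; ProcessMessage is a hypothetical stand-in for the real work):

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class ProducerConsumerSketch
{
    static void Main()
    {
        var queue = new BlockingCollection<int>(boundedCapacity: 100);

        // A fixed number of consumers bounds the concurrency.
        Task[] consumers = new Task[4];
        for (int i = 0; i < consumers.Length; i++)
        {
            consumers[i] = Task.Run(() =>
            {
                // GetConsumingEnumerable blocks while the queue is empty and
                // ends once CompleteAdding has been called and the queue drains.
                foreach (int message in queue.GetConsumingEnumerable())
                {
                    ProcessMessage(message);
                }
            });
        }

        // Producer: enqueue 50 messages, then signal that no more are coming.
        for (int message = 0; message < 50; message++)
        {
            queue.Add(message);
        }
        queue.CompleteAdding();

        Task.WaitAll(consumers);
    }

    static void ProcessMessage(int message)
    {
        // Placeholder for the real work (waiting for other data, calculations, ...).
        Console.WriteLine("Processed " + message);
    }
}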

How can I use AutoResetEvent to signal the Main thread function to start threads again once the first set of worker threads is done processing

Requirement: at any given point in time, only 4 threads should be calling four different functions. As soon as these threads complete, the next available threads should call the same functions.
Current code: this seems to be the worst possible way to achieve something like this. The while (true) loop causes unnecessary CPU spikes, and I could see CPU usage rise to 70% when running the following code.
Question: how can I use AutoResetEvent to signal the Main thread's Process() function to start the next 4 threads once the first 4 worker threads are done processing, without wasting CPU cycles? Please suggest.
public class Demo
{
    object protect = new object();
    private int counter;

    public void Process()
    {
        int maxthread = 4;

        while (true)
        {
            if (counter <= maxthread)
            {
                counter++;
                Thread t = new Thread(new ThreadStart(DoSomething));
                t.Start();
            }
        }
    }

    private void DoSomething()
    {
        try
        {
            Thread.Sleep(50000); // simulate long running process
        }
        finally
        {
            lock (protect)
            {
                counter--;
            }
        }
    }
}
You can use the TPL to achieve what you want in a simpler way. If you run the code below, you'll notice that an entry is written after each task terminates, and only after all four tasks terminate is the "Finished batch" entry written.
This sample uses Task.WaitAll to wait for the completion of all tasks. The code uses an infinite loop for illustration purposes only; you should calculate the hasPendingWork condition based on your requirements so that you only start a new batch of tasks if required.
For example:
private static void Main(string[] args)
{
    bool hasPendingWork = true;
    do
    {
        var tasks = InitiateTasks();

        Task.WaitAll(tasks);

        Console.WriteLine("Finished batch...");
    } while (hasPendingWork);
}

private static Task[] InitiateTasks()
{
    var tasks = new Task[4];
    for (int i = 0; i < tasks.Length; i++)
    {
        int wait = 1000 * i;
        tasks[i] = Task.Factory.StartNew(() =>
        {
            Thread.Sleep(wait);
            Console.WriteLine("Finished waiting: {0}", wait);
        });
    }
    return tasks;
}
One other thing: from the textual requirement section of your question I'm led to believe that a batch of four new threads should only start after all four previous threads have completed. However, the code you posted is not compatible with that requirement, since it starts a new thread immediately after a previous thread terminates. You should clarify what exactly your requirement is.
UPDATE:
If you want to start a thread immediately after one of the four threads terminates, you can still use the TPL instead of starting new threads explicitly, and you can limit the number of running threads to four by using a SemaphoreSlim. For example:
private static SemaphoreSlim TaskController = new SemaphoreSlim(4);

private static void Main(string[] args)
{
    var random = new Random(570);

    while (true)
    {
        // Blocks thread without wasting CPU
        // if the number of resources (4) is exhausted
        TaskController.Wait();

        Task.Factory.StartNew(() =>
        {
            Console.WriteLine("Started");
            Thread.Sleep(random.Next(1000, 3000));
            Console.WriteLine("Completed");

            // Releases a resource meaning TaskController.Wait will unblock
            TaskController.Release();
        });
    }
}
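If you specifically want the AutoResetEvent signalling that the question asks about, one rough way to do it (my own sketch, not part of the answer above) is to have the last worker of each batch set the event so Process() can start the next batch without spinning:

using System;
using System.Threading;

public class Demo
{
    private static readonly AutoResetEvent batchDone = new AutoResetEvent(false);
    private static int remaining;

    public static void Process()
    {
        while (true)
        {
            remaining = 4;
            for (int i = 0; i < 4; i++)
            {
                new Thread(DoSomething).Start();
            }

            // Sleeps until the last worker in the batch signals; no busy-waiting.
            batchDone.WaitOne();
            Console.WriteLine("Batch finished, starting the next one...");
        }
    }

    private static void DoSomething()
    {
        try
        {
            Thread.Sleep(5000); // simulate a long-running process
        }
        finally
        {
            // The worker that brings the count to zero wakes up Process().
            if (Interlocked.Decrement(ref remaining) == 0)
            {
                batchDone.Set();
            }
        }
    }
}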

Return values from two long running methods, using threads

I have a thread that connects to two network resources. Each time I attempt a connection, it can take 10 seconds for a reply.
void MyThread()
{
    // this takes ten seconds
    Resource r1 = MySystem.GetResource(ipAddress1);

    // this takes ten seconds
    Resource r2 = MySystem.GetResource(ipAddress2);

    // do stuff with r1 and r2
}
Total time is twenty seconds, but I'd really like it to take only ten seconds -- launching threads each time I call GetResource, receiving a reply and then joining each thread to return control.
What's the best way to do this? Launch two threads that each return a value? Anonymous methods that take references to local variables? My head is spinning. Code is appreciated.
How about
Resource r1 = null; // need to initialize, else compiler complains
Resource r2 = null;

ThreadStart ts1 = delegate
{
    r1 = MySystem.GetResource(ipAddress1);
};

ThreadStart ts2 = delegate
{
    r2 = MySystem.GetResource(ipAddress2);
};

Thread t1 = new Thread(ts1);
Thread t2 = new Thread(ts2);

t1.Start();
t2.Start();

// do some useful work here, while the threads do their thing...

t1.Join();
t2.Join();

// r1, r2 now set up
Short and sweet.
The easiest way that occurs to me is to run one of the calls on a worker thread while the main thread performs the second initialization and then waits. The following snippet should help illustrate:
ManualResetEvent r1Handle = new ManualResetEvent(false);
Resource r1 = null;
Resource r2 = null;

// Make the threadpool responsible for populating the
// first resource.
ThreadPool.QueueUserWorkItem((state) =>
{
    r1 = MySystem.GetResource(ipAddress1);

    // Set the wait handle to signal the main thread that
    // the work is complete.
    r1Handle.Set();
});

// Populate the second resource.
r2 = MySystem.GetResource(ipAddress2);

// Wait for the threadpool worker to finish.
r1Handle.WaitOne();

// ... Do more stuff
For a more detailed discussion of thread synchronization techniques, you may wish to visit the MSDN article on the topic: http://msdn.microsoft.com/en-us/library/ms173179.aspx
These are always fun questions to ponder, and of course there are multiple ways to solve it.
One approach that's worked well for me is to provide a callback method that each thread uses to pass back results and status. In the following example, I use a List to keep track of running threads and put the results in a Dictionary.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading;
using System.Timers;

namespace ConsoleApplication1
{
    class Program
    {
        static Dictionary<int, object> threadResults = new Dictionary<int, object>();
        static int threadMax = 2;

        static void Main(string[] args)
        {
            List<Thread> runningThreads = new List<Thread>();

            for (int i = 0; i < threadMax; i++)
            {
                Worker worker = new Worker();
                worker.Callback = new Worker.CallbackDelegate(ThreadDone);

                Thread workerThread = new Thread(worker.DoSomething);
                workerThread.IsBackground = true;

                runningThreads.Add(workerThread);
                workerThread.Start();
            }

            foreach (Thread thread in runningThreads) thread.Join();
        }

        public static void ThreadDone(int threadIdArg, object resultsArg)
        {
            threadResults[threadIdArg] = resultsArg;
        }
    }

    class Worker
    {
        public delegate void CallbackDelegate(int threadIdArg, object resultArg);
        public CallbackDelegate Callback { get; set; }

        public void DoSomething()
        {
            // do your thing and put it into results
            object results = new object();

            int myThreadId = Thread.CurrentThread.ManagedThreadId;
            Callback(myThreadId, results);
        }
    }
}
Try this on MSDN: "Asynchronous programming using delegates."
http://msdn.microsoft.com/en-us/library/22t547yb.aspx
-Oisin
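As a minimal sketch of that delegate-based approach applied to the two calls from the question (my addition; it assumes GetResource takes a string address, and delegate BeginInvoke/EndInvoke works on .NET Framework only):

// Run both GetResource calls concurrently with asynchronous delegates.
Func<string, Resource> getResource = MySystem.GetResource;

// Each BeginInvoke runs the delegate on a thread-pool thread.
IAsyncResult ar1 = getResource.BeginInvoke(ipAddress1, null, null);
IAsyncResult ar2 = getResource.BeginInvoke(ipAddress2, null, null);

// EndInvoke blocks until the corresponding call has finished and returns its result.
Resource r1 = getResource.EndInvoke(ar1);
Resource r2 = getResource.EndInvoke(ar2);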
