Delay in parallel computing - C#

I'm using Parallel.For to launch an external program from many threads. But even though these are separate threads, I need to implement something like a delay. E.g. if two threads want to launch the external program at the same moment, one of them should wait and start, say, 10 seconds after the other.
Is it possible?

It's possible, but given the information you've provided it seems pointless... you're enforcing single-threaded execution of the external program, so you might as well have a single thread executing it. If Thread 2 has to wait for Thread 1 in order to start the "external program," then just let Thread 1 do all the work since it already knows when it started the "external program."
The only benefit you will get from a multi-threaded approach is if you have a bunch of processing to do before executing the "external program", and that processing is a good candidate for concurrent execution.
Update
OK, there are a couple of ways to do this with only one extra thread in order to keep your Main/GUI thread responsive. The first approach is a simple lock around the external resource which you're interacting with:
public class ExternalResourceHandler
{
private readonly ExternalResource _resource;
private readonly object _sync = new object();
// constructors
// ...
// other methods
public void PerformExternalOperation()
{
lock(_sync)
{
Result result = _resource.Execute();
// do something with the result
}
}
}
Here are 3 multi-threaded versions for executing the code:
Using the Parallel.For method: recommended if the external program takes a short amount of time to execute - I'd suggest for things under 25 seconds (although this is not necessarily a "correct" number).
Using the ThreadPool: again, I'd recommend it for things that take less than 25 seconds (with the same reservation as above).
Using a Thread: this would be recommended if the operation runs longer (i.e. more than 25 seconds, though it would do just as well if it's under 25 seconds).
Here are some examples (not necessarily functional, mostly meant to give you an idea of the different approaches):
public class Program
{
public static ExternalResourceHandler _erh = new ExternalResourceHandler();
static int Main()
{
Console.WriteLine("Type 'exit' to stop; 'parallel', 'pool' or 'thread' for the corresponding execution version.");
string input = Console.ReadLine();
while(input != "exit")
{
switch(input)
{
case "parallel":
// Run the Parallel.For version
ParallelForVersion();
break;
caase "pool":
// Run the threadpool version
ThreadPoolVersion();
break;
case "thread":
// Run the thread version
ThreadVersion();
break;
default:
break;
}
input = Console.ReadLine();
}
return 0;
}
public static void ParallelForVersion()
{
Parallel.For(0, 1, i =>
{
_erh.PerformExternalOperation();
});
}
public static void ThreadPoolVersion()
{
ThreadPool.QueueUserWorkItem(o=>
{
_erh.PerformExternalOperation();
});
}
public static void ThreadVersion()
{
Thread t = new Thread(()=>
{
_erh.PerformExternalOperation();
});
t.IsBackground = true;
t.Start();
}
}
The other option is to employ the Producer/Consumer design pattern, where your ExternalResourceHandler is the consumer and it processes requests to the external resource from a thread-safe queue. Your main thread just places requests on the queue and immediately returns to its work. Here is an example:
public class ExternalResourceHandler
{
private volatile bool _running;
private readonly ExternalResource _resource;
// BlockingQueue<Request> stands for a thread-safe queue implementation
private readonly BlockingQueue<Request> _requestQueue;
public ExternalResourceHandler( BlockingQueue<Request> requestQueue)
{
_requestQueue = requestQueue;
_running = false;
}
public void QueueRequest(Request request)
{
_requestQueue.Enqueue(request);
}
public void Run()
{
_running = true;
while(_running)
{
Request request = null;
if(_requestQueue.TryDequeue(ref request) && request!=null)
{
_resource.Execute(request);
}
}
}
// methods to stop the handler (i.e. set the _running flag to false)
}
Your main would look like this:
public class Program
{
public static ExternalResourceHandler _erh = new ExternalResourceHandler(new BlockingQueue<Request>());
static int Main()
{
Thread erhThread = new Thread(()=>{_erh.Run();});
erhThread.IsBackground = true;
erhThread.Start();
Console.WriteLine("Type 'exit' to stop or press enter to enqueue another request.");
string input = Console.ReadLine();
while(input != "exit")
{
_erh.QueueRequest(new Request());
input = Console.ReadLine();
}
// Stops the erh by setting the running flag to false
_erh.Stop();
// You may also need to interrupt the thread in order to
// get it out of a blocking state prior to calling Join()
erhThread.Join();
return 0;
}
}
As you can see, in both cases all the work for the external handler is forced onto a single thread, yet your main thread still remains responsive.

Look at the producer-consumer pattern. The first thread produces the information "external program launched"; the second thread consumes it, waits 10 seconds and then launches the external program.
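A minimal sketch of that idea, using a BlockingCollection as the hand-off queue; RunExternalProgram is a hypothetical placeholder for the real Process.Start call, so treat this as an illustration rather than a drop-in solution:
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

class LaunchSpacer
{
    static void Main()
    {
        // Requests to launch the external program are queued here by the parallel workers.
        var launchRequests = new BlockingCollection<string>();

        // Single consumer: launches the program and enforces a 10-second gap between launches.
        var consumer = Task.Run(() =>
        {
            foreach (string args in launchRequests.GetConsumingEnumerable())
            {
                RunExternalProgram(args);                 // hypothetical helper wrapping Process.Start
                Thread.Sleep(TimeSpan.FromSeconds(10));   // spacing between launches
            }
        });

        // Producers: the Parallel.For body only enqueues a request and keeps going.
        Parallel.For(0, 5, i => launchRequests.Add("input-" + i));

        launchRequests.CompleteAdding();                  // no more launches
        consumer.Wait();
    }

    static void RunExternalProgram(string args)
    {
        Console.WriteLine("Launching with {0}", args);    // placeholder for Process.Start(...)
    }
}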

Related

Why "Console.Readline()" takes 5% of CPU while "while (true)" takes 30%, when trying to keep a Thread open?

I'm doing this test on my machine. Obviously the CPU % will vary, but I'm more interested in understanding what's going on.
I simply create a new, blank .NET Framework console application (not .NET Core).
Here is the source of "Program.cs":
static void Main(string[] args)
{
myClass myClass = new myClass();
myClass.myInfiniteMethodAsync();
Console.WriteLine("Launched...");
Console.ReadLine();
}
Then this is the source of myClass:
class myClass
{
public async Task myInfiniteMethodAsync()
{
await Task.Run(() => myInfiniteMethod());
}
public void myInfiniteMethod()
{
//do some things but keep this thread held...
//bool keepRunning = true;
//while (keepRunning) { } <--- this one takes 30% cpu...
Console.ReadLine(); // <--- this one takes 5% cpu...
}
}
I need the infinite method to always stay there, "holding" the thread forever.
If I use the "while(true)" method, the CPU raises up to 30%.
If I use the Console.ReadLine() method, the CPU stays around 5%.
I would like to understand why, and if there's a better method to hold a thread.
In order to answer your question, we need to understand how the operating system performs each of these lines.
For while(keepRunning) {}, the compiled code consists of a comparison of the variable keepRunning followed by a conditional jump. The important part is that the operating system continuously executes instructions.
For Console.ReadLine(), however, the program waits for user input, so the operating system only waits for a hardware interrupt (a key press) and doesn't need to continuously execute instructions.
Therefore the loop version requires the operating system to give your program as much CPU time as it can, because it "wants" to run many instructions, whereas in the input version the operating system only waits for a hardware interrupt and the program doesn't need to execute any other instructions.
What"s the real purpose of the infinite method? Waiting implementation depends on it. But in most cases it's a Producer-Consumer programming pattern. I suggest BlockingCollection or brand-new Channel in that case.
Here's an example for BlockingCollection
static void Main(string[] args)
{
BlockingCollection<string> messages = new BlockingCollection<string>();
Consumer consumer = new Consumer(messages);
consumer.Start();
Console.WriteLine("Consumer launched");
Console.WriteLine("Starting main loop (hit Enter to exit)");
while (true)
{
string m = Console.ReadLine();
if (m.Length > 0)
messages.Add(m);
else
break;
}
consumer.Stop();
Console.WriteLine("Main loop finished");
Console.ReadKey();
}
public class Consumer
{
private BlockingCollection<string> _messages;
public Consumer(BlockingCollection<string> messages)
{
_messages = messages;
}
private void RunLoop()
{
Console.WriteLine("Starting consumer loop");
foreach (string message in _messages.GetConsumingEnumerable()) // <--- this one takes 0% cpu
{
Console.WriteLine("Got message: {0}", message);
}
Console.WriteLine("Consumer loop finished");
}
public void Start()
{
Task.Run(() => RunLoop());
}
public void Stop()
{
_messages.CompleteAdding();
}
}
And the output
Consumer launched
Starting main loop (hit Enter to exit)
Starting consumer loop
test message 1
Got message: test message 1
blablabla
Got message: blablabla
Main loop finished
Consumer loop finished
You don't need async/await here, only one Task.
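If you prefer the Channel option mentioned above, a minimal sketch could look like this (this one does use async/await, and it assumes the System.Threading.Channels package and C# 8 for await foreach):
using System;
using System.Threading.Channels;
using System.Threading.Tasks;

class ChannelConsumerDemo
{
    static async Task Main()
    {
        var channel = Channel.CreateUnbounded<string>();

        // Consumer: asynchronously waits for items, no CPU spinning while idle.
        Task consumer = Task.Run(async () =>
        {
            await foreach (string message in channel.Reader.ReadAllAsync())
                Console.WriteLine("Got message: {0}", message);
        });

        Console.WriteLine("Hit Enter on an empty line to exit");
        while (true)
        {
            string m = Console.ReadLine();
            if (string.IsNullOrEmpty(m))
                break;
            await channel.Writer.WriteAsync(m);
        }

        channel.Writer.Complete();   // completes ReadAllAsync once the queue drains
        await consumer;
    }
}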
Ok, as for the "better method" to hold the thread while having a low cpu consume, I found an answer: the ManualResetEvent !
Instead of:
while (true); //or equivalent while (somevar)
Now I simply do:
ManualResetEvent resetEvent = new ManualResetEvent(false);
resetEvent.WaitOne();
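WaitOne() blocks the thread without spinning, which is why the CPU stays low. For completeness, a small self-contained sketch (names are just for illustration) showing how another thread can later release the blocked one with Set():
using System;
using System.Threading;

class HoldThreadDemo
{
    static readonly ManualResetEvent resetEvent = new ManualResetEvent(false);

    static void Main()
    {
        var worker = new Thread(() =>
        {
            Console.WriteLine("Worker waiting...");
            resetEvent.WaitOne();          // blocks without burning CPU
            Console.WriteLine("Worker released.");
        });
        worker.Start();

        Console.ReadLine();                // press Enter to release the worker
        resetEvent.Set();                  // wakes the blocked thread
        worker.Join();
    }
}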

Is there a way to abort a thread and then open it again with a new variable?

I want to open a thread to do the things it needs to do until a new command is given by the user. Then this thread should either close or receive a new command.
I have seen many posts saying that sending a variable to a running thread is hard, which is why I decided to kill the thread and start it again with the new variable.
I used the following post: https://stackoverflow.com/a/1327377 but without success. When I start the thread again (after it has been aborted), it throws a System.Threading.ThreadStateException.
private static Thread t = new Thread(Threading);
private static bool _running = false;
static void Main(string[] args)
{
[get arg]
if (CanRedo(arg))
{
if (t.IsAlive)
{
_running = false;
t.Interrupt();
if (t.Join(2000)) // with a '!' like in the post, abort() would not be called
{
t.Abort();
}
}
_running = true;
t.Start(arg); // gives System.Threading.ThreadStateException
}
}
private static void Threading(object obj)
{
_stopped = false;
string arg = obj.ToString();
while(_running)
{
if (bot._isDone)
{
ExecuteInstruction(arg);
}
}
}
What am I doing wrong?
I'm going to guess that you don't literally mean to abort the thread and start that same thread again. That's because if we start a thread to do some work we don't care which thread it is. If you cancel one thing and start something else, you probably don't care if it's the same thread or a different one. (In fact it's probably better if you don't care. If you need precise control over which thread is doing what then something has gotten complicated.) You can't "abort" a thread and restart it anyway.
Regarding Thread.Abort:
The Thread.Abort method should be used with caution. Particularly when you call it to abort a thread other than the current thread, you do not know what code has executed or failed to execute when the ThreadAbortException is thrown, nor can you be certain of the state of your application or any application and user state that it is responsible for preserving. For example, calling Thread.Abort may prevent static constructors from executing or prevent the release of unmanaged resources.
It's like firing an employee by teleporting them out of the building without warning. What if they were in the middle of a phone call or carrying a stack of papers? That might be okay in an emergency, but it wouldn't be a normal way to operate. It would be better to let the employee know that they need to wrap up what they're doing immediately. Put down what you're carrying. Tell the customer that you can't finish entering their order and they'll need to call back.
You're describing an expected behavior, so it would be better to cancel the thread in an orderly way.
That's where we might use a CancellationToken. In effect you're passing an object to the thread and telling it to check it from time to time to see if it should cancel what it's doing.
So you could start your thread like this:
class Program
{
static void Main(string[] args)
{
using (var cts = new CancellationTokenSource())
{
ThreadPool.QueueUserWorkItem(DoSomethingOnAnotherThread, cts.Token);
// This is just for demonstration. It allows the other thread to run for a little while
// before it gets canceled.
Thread.Sleep(5000);
cts.Cancel();
}
}
private static void DoSomethingOnAnotherThread(object obj)
{
var cancellationToken = (CancellationToken) obj;
// This thread does its thing. Once in a while it does this:
if (cancellationToken.IsCancellationRequested)
{
return;
}
// Keep doing what it's doing.
}
}
Whatever the method is that's running in your separate thread, it's going to check IsCancellationRequested from time to time. If it's right in the middle of doing something it can stop. If it has unmanaged resources it can dispose them. But the important thing is that you can cancel what it does in a predictable way that leaves your application in a known state.
CancellationToken is one way to do this. In other really simple scenarios where the whole thing is happening inside one class you could also use a boolean field or property that acts as a flag to tell the thread if it needs to stop. The separate thread checks it to see if cancellation has been requested.
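For illustration, a minimal sketch of that flag-based approach - the class and member names here are hypothetical:
using System;
using System.Threading;

class FlagCancellationDemo
{
    // volatile so the worker thread always sees the latest value written by the main thread
    private static volatile bool _stopRequested;

    static void Main()
    {
        var worker = new Thread(() =>
        {
            while (!_stopRequested)
            {
                // do a small unit of work, then check the flag again
                Thread.Sleep(100);
            }
            Console.WriteLine("Worker stopped cleanly.");
        });
        worker.Start();

        Thread.Sleep(2000);       // let it run for a bit
        _stopRequested = true;    // request cancellation
        worker.Join();
    }
}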
But using the CancellationToken makes it more manageable if you refactor and the method executing on another thread ends up in a separate class. When you use a known pattern it makes it easier for the next person to understand what's going on.
Here's some documentation.
What about doing it this way:
private static Task t = null;
private static CancellationTokenSource cts = null;
static void Main(string[] args)
{
[get arg]
if (CanRedo(out var arg))
{
if (t != null)
{
cts.Cancel();
t.Wait();
}
// Set up a new task and matching cancellation token
cts = new CancellationTokenSource();
t = Task.Run(() => liveTask(arg, cts.Token));
}
}
private static void liveTask(object obj, CancellationToken ct)
{
string arg = obj.ToString();
while(!ct.IsCancellationRequested)
{
if (bot._isDone)
{
ExecuteInstruction(arg);
}
}
}
Tasks are cancellable, and I can see nothing in your thread that requires the same physical thread to be re-used.

How to call a method on a running thread?

In a console application, I am currently starting an array of threads. Each thread is passed an object and runs a method on it. I would like to know how to call a method on the object inside the individual running threads.
Dispatcher doesn't work. SynchronizationContext.Send runs on the calling thread and Post uses a new thread. I would like to be able to call the method, and pass parameters, on the target thread it's running on - not on the calling thread and not on a new thread.
Update 2: Sample code
using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
namespace CallingFromAnotherThread
{
class Program
{
static void Main(string[] args)
{
var threadCount = 10;
var threads = new Thread[threadCount];
Console.WriteLine("Main on Thread " + Thread.CurrentThread.ManagedThreadId);
for (int i = 0; i < threadCount; i++)
{
Dog d = new Dog();
threads[i] = new Thread(d.Run);
threads[i].Start();
}
Thread.Sleep(5000);
//how can I call dog.Bark("woof")
//on the individual dogs and make sure they run on the thread they were created on,
//not on the calling thread and not on a new thread?
}
}
class Dog
{
public void Run()
{
Console.WriteLine("Running on Thread " + Thread.CurrentThread.ManagedThreadId);
}
public void Bark(string text)
{
Console.WriteLine(text);
Console.WriteLine("Barking on Thread " + Thread.CurrentThread.ManagedThreadId);
}
}
}
Update 1:
Using synchronizationContext.Send results in using the calling thread:
Channel created
Main thread 10
SyncData Added for thread 11
Consuming channel ran on thread 11
Calling AddConsumer on thread 10
Consumer added consumercb78b. Executed on thread 10
Calling AddConsumer on thread 10
Consumer added consumer783c4. Executed on thread 10
Using synchronizationContext.Post results in using a different thread:
Channel created
Main thread 10
SyncData Added for thread 11
Consuming channel ran on thread 11
Calling AddConsumer on thread 12
Consumer added consumercb78b. Executed on thread 6
Calling AddConsumer on thread 10
Consumer added consumer783c4. Executed on thread 7
The target thread must run the code "on itself" - otherwise you are just accessing the object across threads. This is done with some form of event dispatch loop on the target thread itself.
The SynchronizationContext abstraction can and does support this if the underlying provider supports it. For example, in either WinForms or WPF (which themselves use the "window message pump"), using Post will "run on the UI thread".
Basically, all such constructs follow some variation of the pattern:
// On "target thread"
while (running) {
var action = getNextDelegateFromQueue();
action();
}
// On other thread
postDelegateToQueue(actionToDoOnTargetThread);
It is fairly simple to create a primitive queue system manually - just make sure to use the correct synchronization guards. (Although I am sure there are tidy "solved problem" libraries out there, including ones that wrap everything up into a SynchronizationContext.)
Here is a primitive version of the manual queue. Note that there is a race condition (see note 1 at the end).. but, FWIW:
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading;
namespace DogPark
{
internal class DogPark
{
private readonly string _parkName;
private readonly Thread _thread;
private readonly ConcurrentQueue<Action> _actions = new ConcurrentQueue<Action>();
private volatile bool _isOpen;
public DogPark(string parkName)
{
_parkName = parkName;
_isOpen = true;
_thread = new Thread(OpenPark);
_thread.Name = parkName;
_thread.Start();
}
// Runs in "target" thread
private void OpenPark(object obj)
{
while (true)
{
Action action;
if (_actions.TryDequeue(out action))
{
Program.WriteLine("Something is happening at {0}!", _parkName);
try
{
action();
}
catch (Exception ex)
{
Program.WriteLine("Bad dog did {0}!", ex.Message);
}
}
else
{
// Nothing left!
if (!_isOpen && _actions.IsEmpty)
{
return;
}
}
Thread.Sleep(0); // Don't toaster CPU
}
}
// Called from external thread
public void DoItInThePark(Action action)
{
if (_isOpen)
{
_actions.Enqueue(action);
}
}
// Called from external thread
public void ClosePark()
{
_isOpen = false;
Program.WriteLine("{0} is closing for the day!", _parkName);
// Block until queue empty.
while (!_actions.IsEmpty)
{
Program.WriteLine("Waiting for the dogs to finish at {0}, {1} actions left!", _parkName, _actions.Count);
Thread.Sleep(0); // Don't toaster CPU
}
Program.WriteLine("{0} is closed!", _parkName);
}
}
internal class Dog
{
private readonly string _name;
public Dog(string name)
{
_name = name;
}
public void Run()
{
Program.WriteLine("{0} is running at {1}!", _name, Thread.CurrentThread.Name);
}
public void Bark()
{
Program.WriteLine("{0} is barking at {1}!", _name, Thread.CurrentThread.Name);
}
}
internal class Program
{
// "Thread Safe WriteLine"
public static void WriteLine(string format, params object[] arguments)
{
lock (Console.Out)
{
Console.Out.WriteLine(format, arguments);
}
}
private static void Main(string[] args)
{
Thread.CurrentThread.Name = "Home";
var yorkshire = new DogPark("Yorkshire");
var thunderpass = new DogPark("Thunder Pass");
var bill = new Dog("Bill the Terrier");
var rosemary = new Dog("Rosie");
bill.Run();
yorkshire.DoItInThePark(rosemary.Run);
yorkshire.DoItInThePark(rosemary.Bark);
thunderpass.DoItInThePark(bill.Bark);
yorkshire.DoItInThePark(rosemary.Run);
thunderpass.ClosePark();
yorkshire.ClosePark();
}
}
}
The output should look about like the following - keep in mind that it will change when run multiple times due to the inherent nature of unsynchronized threads.
Bill the Terrier is running at Home!
Something is happening at Thunder Pass!
Something is happening at Yorkshire!
Rosie is running at Yorkshire!
Bill the Terrier is barking at Thunder Pass!
Something is happening at Yorkshire!
Rosie is barking at Yorkshire!
Something is happening at Yorkshire!
Rosie is running at Yorkshire!
Thunder Pass is closing for the day!
Thunder Pass is closed!
Yorkshire is closing for the day!
Yorkshire is closed!
There is nothing preventing a dog from performing at multiple dog parks simultaneously.
1 There is a race condition present and it is this: a park may close before the last dog action runs.
This is because the dog park thread dequeues the action before the action is run - and the method to close the dog park only waits until all the actions are dequeued.
There are multiple ways to address it, for instance:
The concurrent queue could first peek-use-then-dequeue-after-the-action, or
A separate volatile isClosed-for-real flag (set from the dog park thread) could be used, or ..
I've left the bug in as a reminder of the perils of threading..
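If you did want to fix it, a rough sketch of the second option (a separate flag set from the park thread once the queue has drained and the last action has run) might look like this - shown only as changes to the DogPark class above, with the logging and try/catch omitted for brevity:
    // new field, written only by the park thread
    private volatile bool _isClosedForReal;

    private void OpenPark(object obj)
    {
        while (true)
        {
            Action action;
            if (_actions.TryDequeue(out action))
            {
                action();                    // the action completes before closure can be observed
            }
            else if (!_isOpen)
            {
                _isClosedForReal = true;     // queue drained and no new work accepted
                return;
            }
            Thread.Sleep(0);                 // don't toaster CPU
        }
    }

    public void ClosePark()
    {
        _isOpen = false;
        while (!_isClosedForReal)            // wait until the last action has actually run
        {
            Thread.Sleep(0);
        }
        Program.WriteLine("{0} is closed!", _parkName);
    }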
A running thread is already executing a method. You cannot directly force that thread to leave the method and enter a new one. However, you can send information to that thread telling it to leave the current method and do something else - but this only works if the executing method can react to that passed information.
In general, you can use threads to call/execute methods, but you cannot call a method ON a running thread.
Edit, based on your updates:
If you want to use the same threads to execute dog.Run and dog.Bark, and do it on the same objects, then you need to modify your code:
static void Main(string[] args)
{
var threadCount = 10;
var threads = new Thread[threadCount];
Console.WriteLine("Main on Thread " + Thread.CurrentThread.ManagedThreadId);
// keep the dog objects outside the creation block in order to access them later again. Always useful.
Dog[] dogs = new Dog[threadCount];
for (int i = 0; i < threadCount; i++)
{
dogs[i] = new Dog();
threads[i] = new Thread(dogs[i].Run);
threads[i].Start();
}
Thread.Sleep(5000);
//how can i call dog.Bark("woof") --> here you go:
for (int i = 0; i < threadCount; i++)
{
int index = i; // copy the loop variable so the lambda captures the right dog
threads[i] = new Thread(() => dogs[index].Bark("woof"));
threads[i].Start();
}
// but this will create NEW threads, because the others have exited after finishing Run and have been cleaned up. Is this a problem for you?
// maybe the other threads are still running, causing a parallel execution of Run and Bark.
//on the individual dogs and make sure they run on the thread they were created on.
//not on the calling thread and not on a new thread. -->
// instead of d.run, call d.doActions and loop inside that function, check for commands from external sources:
for (int i = 0; i < threadCount; i++)
{
threads[i] = new Thread(dogs[i].doActions);
threads[i].Start();
}
// but in this case there will be sequential execution of actions. No parallel run and bark.
}
Inside your dog class:
enum EnumAction
{
    Nothing,
    Run,
    Bark,
    Exit,
}

EnumAction m_enAction;
readonly object m_oLockAction = new object();

public void SetAction(EnumAction i_enAction)
{
    lock (m_oLockAction)
    {
        m_enAction = i_enAction;
    }
}

private EnumAction GetAction()
{
    lock (m_oLockAction)
    {
        return m_enAction;
    }
}

public void doActions()
{
    EnumAction enAction;
    do
    {
        Thread.Sleep(20);
        enAction = GetAction();
        switch (enAction)
        {
            case EnumAction.Run:
                Run();
                break;
            // case EnumAction.Bark: Bark("woof"); break; etc.
        }
    } while (enAction != EnumAction.Exit);
}
Got it? ;-)
Sorry for any typos - I was typing on my mobile phone, and I usually use C++/CLI.
Another piece of advice: since you read the variable m_enAction inside the thread and write it from outside, you need to ensure it gets updated properly despite being accessed from different threads. The threads must not cache the variable in a CPU register, otherwise they won't see it change. Use locks (e.g. Monitor or the lock statement) to achieve that. (But do not lock on m_enAction itself, because you can only lock on reference-type objects; create a dummy object for this purpose.)
I have added the necessary code. Check out the differences between the edits to see the changes.
You cannot run a second method on a thread while that thread is still running the first one. If you want them to run in parallel you need another thread, and your object needs to be thread safe.
Execution of a thread simply means execution of a sequence of instructions. A dispatcher is nothing more than an infinite loop that executes queued methods one after another.
I recommend you use tasks instead of threads. Use Parallel.ForEach to run the Dog.Run method on each dog object instance. To run the Bark method, use Task.Run(() => dog.Bark("woof")); see the sketch below.
Since you used a running and barking dog as an example, you could also write your own "dispatcher": an infinite loop that executes all queued work. In that case you could have all dogs on a single thread. Sounds weird, but you could have an unlimited number of dogs. In the end, only as many threads can execute at the same time as there are CPU cores available.
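A rough sketch of that suggestion, reusing the Dog class from the question (illustrative only - note that Bark takes a string, so it is wrapped in a lambda):
using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        Dog[] dogs = Enumerable.Range(0, 10).Select(_ => new Dog()).ToArray();

        // Run() for every dog, distributed over the thread pool
        Parallel.ForEach(dogs, dog => dog.Run());

        // Later, bark on whichever pool thread is free
        Task[] barks = dogs.Select(dog => Task.Run(() => dog.Bark("woof"))).ToArray();
        Task.WaitAll(barks);
    }
}

class Dog
{
    public void Run()
    {
        Console.WriteLine("Running on Thread " + Thread.CurrentThread.ManagedThreadId);
    }

    public void Bark(string text)
    {
        Console.WriteLine(text + " on Thread " + Thread.CurrentThread.ManagedThreadId);
    }
}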

One of multiple Tasks acquires a lock in Mutex much longer than other Tasks do

SITUATION
Currently in my project I have 3 Workers, each with a working loop inside, and one CommonWork class object which contains work methods (DoFirstTask, DoSecondTask, DoThirdTask) that the Workers can call. Each work method must be executed mutually exclusively with respect to the others. Each method spawns more nested Tasks that are waited on until they finish.
PROBLEM
When all 3 Workers are started, 2 Workers run at roughly the same speed while the 3rd Worker lags behind, or the 1st Worker is super fast, the 2nd a bit slower and the 3rd very slow; it varies from run to run.
BIZARRENESS
When only 2 Workers are working, they share the work nicely and run at the same speed.
What's more interesting, even though the 3rd Worker calls fewer CommonWork methods and so has the potential to complete more loop cycles, it does not. I tried to simulate that in the code below with the condition:
if (Task.CurrentId.Value < 3)
When debugging, I found that the 3rd Worker waits substantially longer than the other Workers to acquire the Mutex. Sometimes the other two Workers just work interchangeably, and the 3rd keeps waiting on Mutex.WaitOne() - I guess without ever really entering it, because the other Workers have no problem acquiring the lock.
WHAT I TRIED ALREADY
I tried starting the Worker Tasks with TaskCreationOptions.LongRunning, but nothing changed. I also tried making the nested Tasks child Tasks by specifying TaskCreationOptions.AttachedToParent, thinking it might be related to local queues and scheduling, but apparently it is not.
SIMPLIFIED CODE
Below is simplified code from my real-world application. Sadly, I could not reproduce this situation in this simple example:
class Program
{
public class CommonWork
{
private Mutex _mutex;
public CommonWork() { this._mutex = new Mutex(false); }
private void Lock() { this._mutex.WaitOne(); }
private void Unlock() { this._mutex.ReleaseMutex(); }
public void DoFirstTask(int taskId)
{
this.Lock();
try
{
// imitating sync work from 3rd Party lib, that I need to make async
var t = Task.Run(() => {
Thread.Sleep(500); // sync work
});
... // doing some work here
t.Wait();
Console.WriteLine("Task {0}: DoFirstTask - complete", taskId);
}
finally { this.Unlock(); }
}
public void DoSecondTask(int taskId)
{
this.Lock();
try
{
// imitating sync work from 3rd Party lib, that I need to make async
var t = Task.Run(() => {
Thread.Sleep(500); // sync work
});
... // doing some work here
t.Wait();
Console.WriteLine("Task {0}: DoSecondTask - complete", taskId);
}
finally { this.Unlock(); }
}
public void DoThirdTask(int taskId)
{
this.Lock();
try
{
// imitating sync work from 3rd Party lib, that I need to make async
var t = Task.Run(() => {
Thread.Sleep(500); // sync work
});
... // doing some work here
t.Wait();
Console.WriteLine("Task {0}: DoThirdTask - complete", taskId);
}
finally { this.Unlock(); }
}
}
// Worker class
public class Worker
{
private CommonWork CommonWork { get; set; }
public Worker(CommonWork commonWork)
{ this.CommonWork = commonWork; }
private void Loop()
{
while (true)
{
this.CommonWork.DoFirstTask(Task.CurrentId.Value);
if (Task.CurrentId.Value < 3)
{
this.CommonWork.DoSecondTask(Task.CurrentId.Value);
this.CommonWork.DoThirdTask(Task.CurrentId.Value);
}
}
}
public Task Start()
{
return Task.Run(() => this.Loop());
}
}
static void Main(string[] args)
{
var work = new CommonWork();
var client1 = new Worker(work);
var client2 = new Worker(work);
var client3 = new Worker(work);
client1.Start();
client2.Start();
client3.Start();
Console.ReadKey();
}
} // end of Program
The solution was to use new SemaphoreSlim(1) instead of the Mutex (or a simple lock, or Monitor). Only with SemaphoreSlim did the thread scheduling become round-robin, so no Threads/Tasks were treated as "special" with respect to the others. Thanks I3arnon.
If someone could comment why it is so, I would appreciate it.
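For reference, a sketch of what that change looks like in the simplified CommonWork class above - only the locking members change, and the three Do...Task methods keep calling Lock()/Unlock() exactly as before (an illustration, not the asker's actual code):
    public class CommonWork
    {
        // SemaphoreSlim(1) used in place of the Mutex: one thread at a time, no thread affinity
        private readonly SemaphoreSlim _semaphore = new SemaphoreSlim(1);

        private void Lock() { this._semaphore.Wait(); }
        private void Unlock() { this._semaphore.Release(); }

        // DoFirstTask / DoSecondTask / DoThirdTask are unchanged:
        // Lock(); try { ...work... } finally { Unlock(); }
    }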

How can I use AutoResetEvent to signal the main thread to start threads again once the first set of worker threads are done processing

Requirement: at any given point in time, only 4 threads should be calling four different functions. As soon as these threads complete, the next available threads should call the same functions.
Current code: this seems to be the worst possible way to achieve something like this. The while(true) causes unnecessary CPU spikes, and I could see CPU usage rising to 70% when running the following code.
Question: how can I use AutoResetEvent to signal the main thread's Process() function to start the next 4 threads once the first 4 worker threads are done processing, without wasting CPU cycles? Please suggest.
public class Demo
{
object protect = new object();
private int counter;
public void Process()
{
int maxthread = 4;
while (true)
{
if (counter <= maxthread)
{
counter++;
Thread t = new Thread(new ThreadStart(DoSomething));
t.Start();
}
}
}
private void DoSomething()
{
try
{
Thread.Sleep(50000); //simulate long running process
}
finally
{
lock (protect)
{
counter--;
}
}
}
You can use the TPL to achieve what you want in a simpler way. If you run the code below you'll notice that an entry is written after each task terminates, and only after all four tasks terminate is the "Finished batch" entry written.
This sample uses Task.WaitAll to wait for the completion of all tasks. The code uses an infinite loop for illustration purposes only; you should calculate the hasPendingWork condition based on your requirements so that you only start a new batch of tasks if required.
For example:
private static void Main(string[] args)
{
bool hasPendingWork = true;
do
{
var tasks = InitiateTasks();
Task.WaitAll(tasks);
Console.WriteLine("Finished batch...");
} while (hasPendingWork);
}
private static Task[] InitiateTasks()
{
var tasks = new Task[4];
for (int i = 0; i < tasks.Length; i++)
{
int wait = 1000*i;
tasks[i] = Task.Factory.StartNew(() =>
{
Thread.Sleep(wait);
Console.WriteLine("Finished waiting: {0}", wait);
});
}
return tasks;
}
One other thing: from the textual requirement section of your question I'm led to believe that a batch of four new threads should only start after all previous four threads have completed. However, the code you posted is not compatible with that requirement, since it starts a new thread immediately after a previous thread terminates. You should clarify what exactly your requirement is.
UPDATE:
If you want to start a thread immediately after one of the four threads terminates, you can still use the TPL instead of starting new threads explicitly, but limit the number of running threads to four by using a SemaphoreSlim. For example:
private static SemaphoreSlim TaskController = new SemaphoreSlim(4);
private static void Main(string[] args)
{
var random = new Random(570);
while (true)
{
// Blocks thread without wasting CPU
// if the number of resources (4) is exhausted
TaskController.Wait();
Task.Factory.StartNew(() =>
{
Console.WriteLine("Started");
Thread.Sleep(random.Next(1000, 3000));
Console.WriteLine("Completed");
// Releases a resource meaning TaskController.Wait will unblock
TaskController.Release();
});
}
}
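For completeness, if you really do want the AutoResetEvent approach the question title asks about (with strict batches of four), a sketch in the style of the Demo class above could look roughly like this - it is an illustration, not tested production code:
public class Demo
{
    private readonly AutoResetEvent _batchDone = new AutoResetEvent(false);
    private int _pending;

    public void Process()
    {
        while (true)
        {
            _pending = 4;
            for (int i = 0; i < 4; i++)
            {
                new Thread(DoSomething).Start();
            }

            // Blocks without spinning until the last thread of the batch signals
            _batchDone.WaitOne();
        }
    }

    private void DoSomething()
    {
        try
        {
            Thread.Sleep(50000); // simulate long running process
        }
        finally
        {
            // The last thread to finish wakes Process() so it can start the next batch
            if (Interlocked.Decrement(ref _pending) == 0)
            {
                _batchDone.Set();
            }
        }
    }
}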
