I have two threads reaching a point in the server program at slightly different times, both sending a string. I want the server to pause at this point until both threads have received a string and then continue. Currently I am using Console.ReadKey(); to "pause" the thread, but this isn't a solution, as I need to press a key twice (once for each thread) in order to progress.
Is there a way to have a global counter in the Program class which is accessible and editable by all threads at all times, a similar concept to a ConcurrentDictionary? That way I can distinguish between threads based on which thread sent a string first and make the program hang until the counter is satisfied that both clients have 'answered'.
class Program
{
public static bool isFirstThread = false;
static void Main(string[] args)
{
runServer();
}
static void runServer()
{
//server setup
Thread[] threadsArray = new Thread[2];
int i = 0;
try
{
while(true) //game loop
{
Socket connection;
connection = listener.AcceptSocket();
threadRequest = new Handler();
if(i==0) //first thread
{
threadsArray[i] = new Thread(() => threadRequest.clientInteraction(connection, true));
}
else //not first thread
{
threadsArray[i] = new Thread(() => threadRequest.clientInteraction(connection, false));
}
threadsArray[i].Start();
i++;
}
}
catch(Exception e)
{
Console.WriteLine("Exception: " + e.ToString());
}
}
}
class Handler
{
public void clientInteraction(Socket connection, bool isFirstThread)
{
string pAnswer = string.Empty;
//setup streamReaders and streamWriters
while(true) //infinite game loop
{
//read in a question and send to both threads.
pAnswer = sr.ReadLine();
Console.WriteLine(pAnswer);
Console.ReadKey(); //This is where I need the program to hang
awardPoints();
}
}
}
This is a rough idea of what my code is doing; I've chopped quite a bit to avoid question bloat, so there might be a couple of errors I've missed.
I could in theory just set a timer from when the question string is sent from the server, but I would rather not at this stage.
Any thoughts or pointers would be much appreciated. Thanks in advance.
Use a System.Threading.Barrier, which is designed for this exact purpose: making each thread in a group of threads wait until all of them have reached some point in their computation. Initialize it in runServer() like this:
Barrier barrier = new Barrier(2);
And have each thread call this at the point where it needs to wait for the other:
barrier.SignalAndWait();
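A minimal sketch of how this could slot into the Handler from the question (assuming the Barrier created in runServer() is passed into each Handler; the stream setup is omitted just as in the original):
class Handler
{
    private readonly Barrier _barrier;
    public Handler(Barrier barrier)
    {
        _barrier = barrier;
    }
    public void clientInteraction(Socket connection, bool isFirstThread)
    {
        string pAnswer = string.Empty;
        //setup streamReaders and streamWriters
        while (true) //infinite game loop
        {
            pAnswer = sr.ReadLine();      //read this client's answer
            Console.WriteLine(pAnswer);
            _barrier.SignalAndWait();     //both client threads block here until each has an answer
            awardPoints();
        }
    }
}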
I'm doing this test on my machine. Obviously the CPU % will vary, but I'm more interested in understanding what's going on.
I simply create a new, blank Console application targeting .NET Framework (not .NET Core).
Here is the source of "Program.cs":
static void Main(string[] args)
{
myClass myClass = new myClass();
myClass.myInifiniteMethodAsync();
Console.WriteLine("Launched...");
Console.ReadLine();
}
Then this is the source of myClass:
class myClass
{
public async Task myInifiniteMethodAsync()
{
await Task.Run(() => myInfiniteMethod());
}
public void myInfiniteMethod()
{
//do some things but keep this thread held...
//bool keepRunning = true;
//while (keepRunning) { } <--- this one takes 30% cpu...
Console.ReadLine(); // <--- this one takes 5% cpu...
}
}
I need the infinite method to stay there forever, "holding" the thread.
If I use the "while(true)" approach, CPU usage rises to about 30%.
If I use the Console.ReadLine() approach, CPU usage stays around 5%.
I would like to understand why, and whether there's a better way to hold a thread.
To answer your question, we need to understand what the operating system does for each of these lines.
For the while (keepRunning) { } loop, the compiled code is just a comparison of the keepRunning variable followed by a conditional jump. The important part is that the CPU is continuously executing instructions.
For Console.ReadLine(), however, the thread waits for user input: the operating system simply blocks it until a hardware interrupt (a key press) arrives, and no instructions need to be executed in the meantime.
So the loop version demands as much CPU time as the operating system can give it, because it always "wants" to run more instructions, whereas the input version sits idle until the interrupt arrives.
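To make the contrast concrete, here is an illustrative sketch of the three usual ways to hold a thread (CPU figures will vary by machine; the fully blocking options are shown in more detail in the answers below):
public void myInfiniteMethod()
{
    // 1) Busy wait: never yields, so the scheduler keeps running it (roughly one full core).
    //while (keepRunning) { }

    // 2) Polling with a sleep: yields for 100 ms at a time, so CPU use drops to near zero,
    //    at the cost of reacting up to 100 ms late.
    //while (keepRunning) { Thread.Sleep(100); }

    // 3) Blocking wait: the thread is descheduled until something wakes it
    //    (Console.ReadLine, a WaitHandle, a blocking collection, ...).
    Console.ReadLine();
}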
What"s the real purpose of the infinite method? Waiting implementation depends on it. But in most cases it's a Producer-Consumer programming pattern. I suggest BlockingCollection or brand-new Channel in that case.
Here's an example for BlockingCollection
static void Main(string[] args)
{
BlockingCollection<string> messages = new BlockingCollection<string>();
Consumer consumer = new Consumer(messages);
consumer.Start();
Console.WriteLine("Consumer launched");
Console.WriteLine("Starting main loop (hit Enter to exit)");
while (true)
{
string m = Console.ReadLine();
if (m.Length > 0)
messages.Add(m);
else
break;
}
consumer.Stop();
Console.WriteLine("Main loop finished");
Console.ReadKey();
}
public class Consumer
{
private BlockingCollection<string> _messages;
public Consumer(BlockingCollection<string> messages)
{
_messages = messages;
}
private void RunLoop()
{
Console.WriteLine("Starting consumer loop");
foreach (string message in _messages.GetConsumingEnumerable()) // <--- this one takes 0% cpu
{
Console.WriteLine("Got message: {0}", message);
}
Console.WriteLine("Consumer loop finished");
}
public void Start()
{
Task.Run(() => RunLoop());
}
public void Stop()
{
_messages.CompleteAdding();
}
}
And the output
Consumer launched
Starting main loop (hit Enter to exit)
Starting consumer loop
test message 1
Got message: test message 1
blablabla
Got message: blablabla
Main loop finished
Consumer loop finished
You don't need async/await here; a single Task is enough.
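For the Channel alternative mentioned above, here is a rough sketch of the same consumer using System.Threading.Channels (this assumes the System.Threading.Channels package and C# 8's await foreach; treat it as an illustration, not a drop-in replacement):
using System;
using System.Threading.Channels;
using System.Threading.Tasks;
class ChannelConsumerDemo
{
    static async Task Main()
    {
        Channel<string> channel = Channel.CreateUnbounded<string>();
        // Consumer: awaits messages and uses no CPU while the channel is empty.
        Task consumer = Task.Run(async () =>
        {
            await foreach (string message in channel.Reader.ReadAllAsync())
            {
                Console.WriteLine("Got message: {0}", message);
            }
            Console.WriteLine("Consumer loop finished");
        });
        Console.WriteLine("Starting main loop (hit Enter to exit)");
        while (true)
        {
            string m = Console.ReadLine();
            if (m.Length > 0)
                channel.Writer.TryWrite(m);
            else
                break;
        }
        channel.Writer.Complete(); // lets ReadAllAsync finish so the consumer task ends
        await consumer;
        Console.WriteLine("Main loop finished");
    }
}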
OK, as for a "better method" to hold the thread with low CPU consumption, I found an answer: ManualResetEvent!
Instead of:
while (true); //or equivalent while (somevar)
Now I simply do:
ManualResetEvent resetEvent = new ManualResetEvent(false);
resetEvent.WaitOne();
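Note that the thread then stays blocked until some other part of the program calls resetEvent.Set() (or the process exits), so keep the ManualResetEvent reachable from whatever code is supposed to release it.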
I have two threads. How do I get data from thread1 to thread2? That is, when thread1 has done its work, it has some data, and that data must be used in the second thread, thread2. How can I achieve this?
Here is the code, but what do I do now?
static void Main(string[] args)
{
Thread t1 = new Thread(thread1);
t1.Start();
Thread t2 = new Thread(thread2);
t2.Start();
}
static void thread1()
{
string newstring="123";
}
static void thread2()
{
//what to do here...what code will be here?
Console.WriteLine(newstring);
}
In thread1 it can be whatever, but I need to get this "whatever" so that I can use it in thread2.
Data used by both threads must be shared between them; this is usually called a common resource.
One thing you must note is that you have to achieve synchronization here.
As both threads run independently and also read/write the common data, the chance of a race condition is pretty high. To prevent such cases, you must synchronize reads and writes of the common object.
Refer to the code below, where CommonResourceClass is shared between both threads and synchronization is achieved by locking.
In your example, one thread writes data and the other reads it. If we don't implement synchronization, there is a chance that while thread 1 is writing new data, thread 2 (because it is not waiting for thread 1 to complete its task first) will read old or invalid data.
The situation gets worse when multiple threads write data without waiting for the other writers to finish.
public class CommonResourceClass
{
object lockObj;
//Note: here main resource is private
//(thus not in scope of any thread)
string commonString;
//while prop is public where we have lock
public string CommonResource
{
get
{
lock (lockObj)
{
Console.WriteLine(DateTime.Now.ToString() + " $$$$$$$$$$$$$$$ Reading");
Thread.Sleep(1000 * 2);
return commonString;
}
}
set
{
lock (lockObj)
{
Console.WriteLine(DateTime.Now.ToString() + " ************* Writing");
Thread.Sleep(1000 * 5);
commonString = value;
}
}
}
public CommonResourceClass()
{
lockObj = new object();
}
}
And the calling threads look like this:
static CommonResourceClass commonResourceClass;
static void Main(string[] args)
{
commonResourceClass = new CommonResourceClass();
Thread t1 = new Thread(ThreadOneRunner);
Thread t2 = new Thread(ThreadTwoRunner);
t1.Start();
t2.Start();
}
static void ThreadOneRunner()
{
while(true)
{
Console.WriteLine(DateTime.Now.ToString() + " *******Trying To Write");
commonResourceClass.CommonResource = "Written";
Console.WriteLine(DateTime.Now.ToString() + " *******Writing Done");
}
}
static void ThreadTwoRunner()
{
while(true)
{
Console.WriteLine(DateTime.Now.ToString() + " $$$$$$$Trying To Read");
string Data = commonResourceClass.CommonResource;
Console.WriteLine(DateTime.Now.ToString() + " $$$$$$$Reading Done");
}
}
In the output, note that reading takes 2 seconds and writing takes 5 seconds, so reading is supposed to be faster. But if writing is in progress, reading must wait until the write is done.
You can clearly see that while one thread is reading or writing, the other cannot do so until the first has finished its task.
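If all you need is a one-shot handoff (thread1 produces a single value that thread2 reads once), a much smaller sketch is enough. This one uses a shared static field plus a ManualResetEventSlim so that thread2 blocks until the data is ready (the names are illustrative, not from the question):
static string sharedData;                                                // written by thread1, read by thread2
static ManualResetEventSlim dataReady = new ManualResetEventSlim(false);
static void Main(string[] args)
{
    Thread t1 = new Thread(thread1);
    t1.Start();
    Thread t2 = new Thread(thread2);
    t2.Start();
}
static void thread1()
{
    sharedData = "123";   // produce the data
    dataReady.Set();      // signal that it is available
}
static void thread2()
{
    dataReady.Wait();     // block until thread1 has signalled
    Console.WriteLine(sharedData);
}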
Requirement: at any given point in time, only 4 threads should be calling four different functions. As soon as these threads complete, the next available threads should call the same functions.
Current code: this seems to be the worst possible way to achieve something like this. The while (true) loop causes unnecessary CPU spikes, and I can see CPU usage rising to 70% when running the following code.
Question: how can I use an AutoResetEvent to signal the main thread's Process() function to start the next 4 threads once the first 4 worker threads are done processing, without wasting CPU cycles? Please suggest.
public class Demo
{
object protect = new object();
private int counter;
public void Process()
{
int maxthread = 4;
while (true)
{
if (counter <= maxthread)
{
counter++;
Thread t = new Thread(new ThreadStart(DoSomething));
t.Start();
}
}
}
private void DoSomething()
{
try
{
Thread.Sleep(50000); //simulate long running process
}
finally
{
lock (protect)
{
counter--;
}
}
}
}
You can use the TPL to achieve what you want in a simpler way. If you run the code below, you'll notice that an entry is written after each task finishes, and only after all four tasks finish is the "Finished batch" entry written.
This sample uses Task.WaitAll to wait for the completion of all tasks. The code uses an infinite loop for illustration purposes only; you should calculate the hasPendingWork condition based on your requirements so that you only start a new batch of tasks if required.
For example:
private static void Main(string[] args)
{
bool hasPendingWork = true;
do
{
var tasks = InitiateTasks();
Task.WaitAll(tasks);
Console.WriteLine("Finished batch...");
} while (hasPendingWork);
}
private static Task[] InitiateTasks()
{
var tasks = new Task[4];
for (int i = 0; i < tasks.Length; i++)
{
int wait = 1000*i;
tasks[i] = Task.Factory.StartNew(() =>
{
Thread.Sleep(wait);
Console.WriteLine("Finished waiting: {0}", wait);
});
}
return tasks;
}
One other thing: from the textual requirement in your question I'm led to believe that a batch of four new threads should only start after all of the previous four have completed. However, the code you posted doesn't match that requirement, since it starts a new thread immediately after a previous thread terminates. You should clarify what exactly your requirement is.
UPDATE:
If you want to start a thread immediately after one of the four threads terminate you can still use TPL instead of starting new threads explicitly but you can limit the number of running threads to four by using a SemaphoreSlim. For example:
private static SemaphoreSlim TaskController = new SemaphoreSlim(4);
private static void Main(string[] args)
{
var random = new Random(570);
while (true)
{
// Blocks thread without wasting CPU
// if the number of resources (4) is exhausted
TaskController.Wait();
Task.Factory.StartNew(() =>
{
Console.WriteLine("Started");
Thread.Sleep(random.Next(1000, 3000));
Console.WriteLine("Completed");
// Releases a resource meaning TaskController.Wait will unblock
TaskController.Release();
});
}
}
I'm using Parallel.For to launch an external program from many threads. But even though these are separate threads, I need to implement something like a delay. E.g. if 2 threads want to launch this external program at the same moment, then one of them should wait and start, say, 10 seconds after the other.
Is that possible?
It's possible, but given the information you've provided it seems pointless... you're enforcing single-threaded execution of the external program, so you might as well have a single thread executing it. If Thread 2 has to wait for Thread 1 in order to start the "external program," then just let Thread 1 do all the work since it already knows when it started the "external program."
The only benefit you will get from a multi-threaded approach is if you have a bunch of processing that you need to do prior to executing the "external program" and that processing has to be a good candidate for concurrent execution.
Update
OK, there are a couple of ways to do this with only one extra thread in order to keep your Main/GUI thread responsive. The first approach is a simple lock around the external resource which you're interacting with:
public class ExternalResourceHandler
{
private readonly ExternalResource _resource;
private readonly object _sync = new object();
// constructors
// ...
// other methods
public void PerformExternalOperation()
{
lock(_sync)
{
Result result = _resource.Execute();
// do something with the result
}
}
}
Here are 3 multi-threaded version for executing the code:
Using the Parallel.For method: recommended if the external program takes a short time to execute; I'd suggest it for operations under 25 seconds (although this is not necessarily a "correct" number).
Using the ThreadPool: again, recommended for operations that take less than 25 seconds (with the same reservation as above).
Using a Thread: recommended if the operation runs longer (i.e. more than 25 seconds), although it works just as well if it's under 25 seconds.
Here are some examples (not necessarily functional, mostly meant to give you an idea of the different approaches):
public class Program
{
public static ExternalResourceHandler _erh = new ExternalResourceHandler();
static int Main()
{
Console.WriteLine("Type 'exit' to stop; 'parallel', 'pool' or 'thread' for the corresponding execution version.");
string input = Console.ReadLine();
while(input != "exit")
{
switch(input)
{
case "parallel":
// Run the Parallel.For version
ParallelForVersion();
break;
caase "pool":
// Run the threadpool version
ThreadPoolVersion();
break;
case "thread":
// Run the thread version
ThreadVersion();
break;
default:
break;
}
input = Console.ReadLine();
}
return 0;
}
public static void ParallelForVersion()
{
Parallel.For(0, 1, i =>
{
_erh.PerformExternalOperation();
});
}
public static void ThreadPoolVersion()
{
ThreadPool.QueueUserWorkItem(o=>
{
_erh.PerformExternalOperation();
});
}
public static void ThreadVersion()
{
Thread t = new Thread(()=>
{
_erh.PerformExternalOperation();
});
t.IsBackground = true;
t.Start();
}
}
The other option is to employ the Producer/Consumer design pattern where your ExternalResourceHandler is the consumer and it processes requests to the external resource from a thread-safe queue. Your main thread just places requests on the queue and immediately returns back to work. Here is an example:
public class ExternalResourceHandler
{
private volatile bool _running;
private readonly ExternalResource _resource;
private readonly BlockingQueue<Request> _requestQueue;
public ExternalResourceHandler( BlockingQueue<Request> requestQueue)
{
_requestQueue = requestQueue;
_running = false;
}
public void QueueRequest(Request request)
{
_requestQueue.Enqueue(request);
}
public void Run()
{
_running = true;
while(_running)
{
Request request = null;
if(_requestQueue.TryDequeue(ref request) && request!=null)
{
_resource.Execute(request);
}
}
}
// methods to stop the handler (i.e. set the _running flag to false)
}
Your main would look like this:
public class Program
{
public static ExternalResourceHandler _erh = new ExternalResourceHandler(new BlockingQueue<Request>()); // the handler above requires its request queue
static int Main()
{
Thread erhThread = new Thread(()=>{_erh.Run();});
erhThread.IsBackground = true;
erhThread.Start();
Console.WriteLine("Type 'exit' to stop or press enter to enqueue another request.");
string input = Console.ReadLine();
while(input != "exit")
{
_erh.QueueRequest(new Request());
input = Console.ReadLine();
}
// Stops the erh by setting the running flag to false
_erh.Stop();
// You may also need to interrupt the thread in order to
// get it out of a blocking state prior to calling Join()
erhThread.Join();
return 0;
}
}
As you see: in both cases all the work for the external handler is forced on a single thread yet your main thread still remains responsive.
Look at the producer-consumer pattern. The first thread produces the information "external program launched"; the second thread consumes it, waits 10 seconds, and then launches the external program.
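A rough sketch of that idea with a BlockingCollection as the queue (LaunchExternalProgram and the 10-second spacing are placeholders for your own launch code, and usings for System.Collections.Concurrent and System.Threading.Tasks are assumed):
BlockingCollection<int> launchRequests = new BlockingCollection<int>();
// Single consumer: launches the external program one request at a time, 10 s apart.
Task consumer = Task.Run(() =>
{
    foreach (int id in launchRequests.GetConsumingEnumerable())
    {
        LaunchExternalProgram(id);               // placeholder for Process.Start(...) etc.
        Thread.Sleep(TimeSpan.FromSeconds(10));  // enforce the spacing between launches
    }
});
// Producers: the Parallel.For workers just enqueue a request and carry on.
Parallel.For(0, 8, i =>
{
    // ... per-item processing ...
    launchRequests.Add(i);
});
launchRequests.CompleteAdding();                 // no more requests will arrive
consumer.Wait();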
I am creating a thread A, and in that thread I'm creating a new thread B.
So what is the thread hierarchy? Is thread B a child of thread A, or are the threads created as peers?
I want to abort the parent thread A and have that in turn kill/abort its child threads.
How is that possible in C#?
Threads should ideally never be aborted. It simply isn't safe; consider it a way of putting down an already sick process, and otherwise avoid it like the plague.
The more correct way is to have something the code can periodically check, so that it decides for itself to exit.
An example of stopping threads the polite way:
using System;
using System.Threading;
namespace Treading
{
class Program
{
static void Main(string[] args)
{
Console.WriteLine("Main program starts");
Thread firstThread = new Thread(A);
ThreadStateMessage messageToA = new ThreadStateMessage(){YouShouldStopNow = false};
firstThread.Start(messageToA);
Thread.Sleep(50); //Let other threads do their thing for 0.05 seconds
Console.WriteLine("Sending stop signal from main program!");
messageToA.YouShouldStopNow = true;
firstThread.Join();
Console.WriteLine("Main program ends - press any key to exit");
Console.Read();
}
private class ThreadStateMessage
{
public bool YouShouldStopNow = false; //this assignment is not really needed, since default value is false
}
public static void A(object param)
{
ThreadStateMessage myMessage = (ThreadStateMessage)param;
Console.WriteLine("Hello from A");
ThreadStateMessage messageToB = new ThreadStateMessage();
Thread secondThread = new Thread(B);
secondThread.Start(messageToB);
while (!myMessage.YouShouldStopNow)
{
Thread.Sleep(10);
Console.WriteLine("A is still running");
}
Console.WriteLine("Sending stop signal from A!");
messageToB.YouShouldStopNow = true;
secondThread.Join();
Console.WriteLine("Goodbye from A");
}
public static void B(object param)
{
ThreadStateMessage myMessage = (ThreadStateMessage)param;
Console.WriteLine("Hello from B");
while(!myMessage.YouShouldStopNow)
{
Thread.Sleep(10);
Console.WriteLine("B is still running");
}
Console.WriteLine("Goodbye from B");
}
}
}
Using Thread.Abort(); causes an exception to be thrown if your thread is in a waiting state of any kind. This is sort of annoying to handle, since there are quite a number of ways that a thread can be waiting. As others have said, you should generally avoid doing it.
Thread.Abort will do what you want, but aborting threads is not recommended; a better choice is to find a way for threads to finish cleanly using a thread synchronization mechanism.
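For completeness, the same cooperative idea can also be expressed with the built-in CancellationTokenSource instead of a hand-rolled flag class; a minimal sketch (not the poster's code):
CancellationTokenSource cts = new CancellationTokenSource();
Thread worker = new Thread(() =>
{
    while (!cts.Token.IsCancellationRequested)
    {
        Thread.Sleep(10);
        Console.WriteLine("worker is still running");
    }
    Console.WriteLine("worker saw the cancellation request and is exiting");
});
worker.Start();
Thread.Sleep(50);
cts.Cancel();     // ask the worker to stop; it exits at its next check
worker.Join();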
Here's yet another way to politely signal a thread to die:
Note that this approach favors finite state automatons where the slave periodically checks for permission to keep living and then performs a task if allowed. Tasks are not interrupted and are 'atomic'. This works great with simple loops or with command queues. It also makes sure the thread doesn't spin at 100% CPU by giving the slave thread a rest period; set it to 0 if you don't want any rest in your slave.
var dieEvent = new AutoResetEvent(false);
int slaveRestPeriod = 20;// let's not hog the CPU with an endless loop
var master = new Thread(() =>
{
doStuffAMasterDoes(); // long running operation
dieEvent.Set(); // kill the slave
});
var slave = new Thread(() =>
{
while (!dieEvent.WaitOne(slaveRestPeriod))
{
doStuffASlaveDoes();
}
});
slave.Start();
master.Start();
Threads are created as peers. Obtain a handle to thread A and then call ThreadA.Abort() to forcefully end it. It's better, though, to check a boolean in the thread and exit the thread when it evaluates to false.
public class MyClass
{
public static Thread ThreadA;
public static Thread ThreadB;
private void RunThings()
{
ThreadA = new Thread(new ThreadStart(ThreadAWork));
ThreadB = new Thread(new ThreadStart(ThreadBWork));
ThreadA.Start();
ThreadB.Start();
}
static void ThreadAWork()
{
// do some stuff
// thread A will close now, all work is done.
}
static void ThreadBWork()
{
// do some stuff
ThreadA.Abort(); // close thread A
// thread B will close now, all work is done.
}
}