Detecting whether a thread is already running in C#/.NET

I am using the following code.
public void runThread()
{
    if (System.Diagnostics.Process.GetProcessesByName("myThread").Length == 0)
    {
        Thread t = new Thread(new ThreadStart(go));
        t.IsBackground = true;
        t.Name = "myThread";
        t.Start();
    }
    else
    {
        System.Diagnostics.Debug.WriteLine("myThread is already running.");
    }
}

public void go()
{
    //My work goes here
}
I am calling the runThread() function many times, but I want the thread to start only when it is not already running. How is this possible?

GetProcessesByName does not look for threads in your application but for processes on your machine. In fact, there is no good way to query the threads of your own application (writing a debugger is a separate matter).
For what you want, you could create a wrapper class for your threads in such a way that you can query whether they are running, or keep track of the threads yourself by other means.
You could also consider having a Lazy<Thread> field that is initialized when needed and that you can query to see if the thread is still alive. (After testing, Lazy<Thread> turned out not to be a good idea.)
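As a rough sketch of the wrapper-class idea (this is not from the original answer; the class and member names are made up), the wrapper remembers the thread it started and only starts a new one when the previous run has finished:
using System.Threading;

// Hypothetical wrapper: keeps the last started thread and checks it before starting another.
public class TrackedWorker
{
    private readonly object _sync = new object();
    private Thread _thread;

    public bool IsRunning
    {
        get { lock (_sync) { return _thread != null && _thread.IsAlive; } }
    }

    // Returns false if the previous run is still alive, true if a new thread was started.
    public bool TryStart(ThreadStart work)
    {
        lock (_sync)
        {
            if (_thread != null && _thread.IsAlive)
                return false;

            _thread = new Thread(work) { IsBackground = true, Name = "myThread" };
            _thread.Start();
            return true;
        }
    }
}
Usage would be worker.TryStart(go); because the check and the start happen inside one lock, two simultaneous callers cannot both start a thread.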
Derived from Simon's answer:
private int running;

public void runThread()
{
    if (Interlocked.CompareExchange(ref running, 1, 0) == 0)
    {
        Thread t = new Thread
        (
            () =>
            {
                try
                {
                    go();
                }
                catch
                {
                    //Without the catch any exceptions will be unhandled
                    //(Maybe that's what you want, maybe not*)
                }
                finally
                {
                    //Regardless of exceptions, we need this to happen:
                    running = 0;
                }
            }
        );
        t.IsBackground = true;
        t.Name = "myThread";
        t.Start();
    }
    else
    {
        System.Diagnostics.Debug.WriteLine("myThread is already running.");
    }
}

public void go()
{
    //My work goes here
}
public void go()
{
//My work goes here
}
*: Gotta catch 'em all.
Wajid and Segey are right: you could just have a Thread field. Allow me to provide an example:
private Thread _thread;

public void runThread()
{
    var thread = _thread;
    //Prevent optimization from not using the local variable
    Thread.MemoryBarrier();
    if
    (
        thread == null ||
        thread.ThreadState == System.Threading.ThreadState.Stopped
    )
    {
        var newThread = new Thread(go);
        newThread.IsBackground = true;
        newThread.Name = "myThread";
        newThread.Start();
        //Prevent optimization from setting the field before calling Start
        Thread.MemoryBarrier();
        _thread = newThread;
    }
    else
    {
        System.Diagnostics.Debug.WriteLine("myThread is already running.");
    }
}

public void go()
{
    //My work goes here
}
Note: It is better to use the first alternative (the one derived from Simon's answer) because it is thread-safe. That is, if several threads call runThread simultaneously, there is no risk of more than one worker thread being created.

One easy way is to have a flag that indicates whether the thread is running or not. You may have to use a lock if there are conflicts.
public static bool isThreadRunning = false;

public void runThread()
{
    if (!isThreadRunning)
    {
        Thread t = new Thread(new ThreadStart(go));
        t.IsBackground = true;
        t.Name = "myThread";
        t.Start();
    }
    else
    {
        System.Diagnostics.Debug.WriteLine("myThread is already running.");
    }
}

public void go()
{
    isThreadRunning = true;
    try
    {
        //My work goes here
    }
    finally
    {
        //Reset the flag even if the work throws
        isThreadRunning = false;
    }
}

You can use Thread.IsAlive to check whether the previous thread is running or not; it gives you the thread's status. You can put this check before myThread.Start().
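For example, a small sketch of that suggestion (the field name is made up; this is not from the original answer):
private Thread myThread; // holds the previously started thread, if any

public void runThread()
{
    // IsAlive is true from shortly after Start() until the thread terminates.
    if (myThread == null || !myThread.IsAlive)
    {
        myThread = new Thread(go);
        myThread.IsBackground = true;
        myThread.Name = "myThread";
        myThread.Start();
    }
    else
    {
        System.Diagnostics.Debug.WriteLine("myThread is already running.");
    }
}
Note that this check on its own is not safe against several threads calling runThread at the same time; combine it with a lock or the Interlocked approach above if that can happen.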

Do you create the thread only in the runThread method? If so, hold it as a field of the class that contains runThread and ask t.IsAlive.

Maybe this can help you
static bool isRunning = false;

public void RunThread()
{
    if (!isRunning)
    {
        //Set the flag before the work and clear it afterwards
        Thread t = new Thread(() => { isRunning = true; go(); isRunning = false; });
        t.IsBackground = true;
        t.Name = "myThread";
        t.Start();
    }
    else
    {
        System.Diagnostics.Debug.WriteLine("myThread is already running.");
    }
}

public void go()
{
    //My work goes here
}


Using Thread.Sleep in a lock section in C#

I created an example about threads.
I know that using lock can keep threads from clashing in a critical section, but I have two questions.
1. Why does my program get stuck if I use Thread.Sleep?
In this example, I add Sleep to both threads because I want the console output to appear more slowly, so I can easily see if anything is wrong.
But if I use Thread.Sleep(), the program gets stuck!
2. In what situations should I use Thread.Sleep?
Thanks for your kind response, have a nice day.
class MyThreadExample
{
    private static int count1 = 0;
    private static int count2 = 0;
    Thread t1;
    Thread t2;

    public MyThreadExample()
    {
        t1 = new Thread(new ThreadStart(increment));
        t2 = new Thread(new ThreadStart(checkequal));
    }

    public static void Main()
    {
        MyThreadExample mt = new MyThreadExample();
        mt.t1.Start();
        mt.t2.Start();
    }

    void increment()
    {
        lock (this)
        {
            while (true)
            {
                count1++; count2++;
                //Thread.Sleep(0); stuck when use Sleep!
            }
        }
    }

    void checkequal()
    {
        lock (this)
        {
            while (true)
            {
                if (count1 == count2)
                    Console.WriteLine("Synchronize");
                else
                    Console.WriteLine("unSynchronize");
                // Thread.Sleep(0);
            }
        }
    }
}
Please take a look at the following code. Never use lock(this); use lock(syncObj) instead, because you have better control over it. Lock only the critical section (e.g. just the variables) and don't lock the whole while loop. In Main, add something that waits at the end (Console.Read()), otherwise your application exits right away. This version works with or without Thread.Sleep. In your code above, a thread enters Increment or Checkequal and the lock is never released; that's why only Increment or Checkequal runs, never both.
internal class MyThreadExample
{
    private static int m_Count1;
    private static int m_Count2;
    private readonly object m_SyncObj = new object();
    private readonly Thread m_T1;
    private readonly Thread m_T2;

    public MyThreadExample()
    {
        m_T1 = new Thread(Increment) {IsBackground = true};
        m_T2 = new Thread(Checkequal) {IsBackground = true};
    }

    public static void Main()
    {
        var mt = new MyThreadExample();
        mt.m_T1.Start();
        mt.m_T2.Start();
        Console.Read();
    }

    private void Increment()
    {
        while (true)
        {
            lock (m_SyncObj)
            {
                m_Count1++;
                m_Count2++;
            }
            Thread.Sleep(1000); // fine here, outside the lock
        }
    }

    private void Checkequal()
    {
        while (true)
        {
            lock (m_SyncObj)
            {
                Console.WriteLine(m_Count1 == m_Count2 ? "Synchronize" : "unSynchronize");
            }
            Thread.Sleep(1000);
        }
    }
}
Thread is a little old-fashioned. If you are new to .NET and using .NET 4.5 or above, use Task instead; it is much better. All the newer multithreading features in .NET are built on Task, such as async/await:
public static void Main()
{
    var mt = new MyThreadExample();
    Task.Run(() => { mt.Increment(); });
    Task.Run(() => { mt.Checkequal(); });
    Console.Read();
}

How to close the thread after it ends?

I wonder how to abort my thread with Thread.Abort() after my function ends.
My application runs files, and each file is opened in a different thread:
int _counter;
int _parallelThreads;
Queue _queue = new Queue();

public void transmit()
{
    while (_counter < _parallelThreads)
    {
        lock (_queue)
        {
            string file = (string)_queue.Dequeue();
            ThreadStart ts = delegate { processFile(file); };
            Thread thread = new Thread(ts);
            thread.IsBackground = true;
            thread.Start();
            _counter++;
        }
    }
}

private void processFile(string file)
{
    WiresharkFile wf = new WiresharkFile(file, _selectedOutputDevice, 1);
    wf.OnFinishPlayEvent += wf_OnFinishPlayEvent;
    wf.sendBuffer();
}
and this is the event that fires when my file has finished:
private void wf_OnFinishPlayEvent(MyClass myClass)
{
    // here i want to abort my thread
}
The reason I want to abort my thread when it finishes is that I think this is the cause of my memory leak when I open a lot of parallel threads and run them over and over (my application's memory usage reads more than 1 GB).
When you abort a thread, a lot of unexpected things can go wrong, particularly when you work with files. When I had to do that (for example, for a "cancel" button) I used a little trick:
I had an IsCanceled flag in a scope both threads can see, set to true when cancelling; on the worker thread, every few statements, I check that flag, close all open files and let the thread end itself.
This might not work well for your situation, depending on the logic of wf.sendBuffer(); let me know.
Example:
private void processFile(string file)
{
    WiresharkFile wf = new WiresharkFile(file, _selectedOutputDevice, 1);
    wf.OnFinishPlayEvent += wf_OnFinishPlayEvent;
    if (IsCanceled == false)
    {
        wf.sendBuffer();
    }
}
and if the sendBuffer() method logic is too long, then
public void sendBuffer()
{
    // some logic
    if (IsCanceled)
    {
        // close open streams
        return;
    }
    // some logic
}
As for the flag itself, a singleton class could do just fine for that, or a class that all the other classes know about:
public class Singleton
{
    private static Singleton instance;
    private bool isCanceled;

    private Singleton()
    {
        isCanceled = false;
    }

    public static Singleton Instance
    {
        get
        {
            if (instance == null)
            {
                instance = new Singleton();
            }
            return instance;
        }
    }

    public bool IsCanceled
    {
        get
        {
            return isCanceled;
        }
        set
        {
            isCanceled = value;
        }
    }
}
Notice that the singleton class is open to everyone; you might want to use a class known only to the threads that need to check it. That is something that depends on your security needs.
You should not abort the threads; a thread quits automatically when the code in it finishes. Maybe you just want to wait for the threads to finish and then do something else.
You can use a list to store the threads and Thread.Join() to wait for all of them to end.
List<Thread> threadList = new List<Thread>();

public void transmit()
{
    while (_counter < _parallelThreads)
    {
        lock (_queue)
        {
            string file = (string)_queue.Dequeue();
            ThreadStart ts = delegate { processFile(file); };
            Thread thread = new Thread(ts);
            thread.IsBackground = true;
            threadList.Add(thread); //add thread to list
            thread.Start();
            _counter++;
        }
    }

    //wait for the threads to end
    foreach (Thread t in threadList)
        t.Join();
}

private void processFile(string file)
{
    WiresharkFile wf = new WiresharkFile(file, _selectedOutputDevice, 1);
    wf.OnFinishPlayEvent += wf_OnFinishPlayEvent;
    wf.sendBuffer();
}

Synchronization across threads / atomic checks?

I need to create a method invoker that any thread (Thread B, for example's sake) can call, which will execute on the main executing thread (Thread A) at a specific point in its execution.
Example usage would be as follows:
static Invoker Invoker = new Invoker();

static void ThreadA()
{
    new Thread(ThreadB).Start();
    Thread.Sleep(...); // Hypothetic Alpha
    Invoker.Invoke(delegate { Console.WriteLine("Action"); }, true);
    Console.WriteLine("Done");
    Console.ReadLine();
}

static void ThreadB()
{
    Thread.Sleep(...); // Hypothetic Beta
    Invoker.Execute();
}
The Invoker class looks like this:
public class Invoker
{
    private Queue<Action> Actions { get; set; }

    public Invoker()
    {
        this.Actions = new Queue<Action>();
    }

    public void Execute()
    {
        while (this.Actions.Count > 0)
        {
            this.Actions.Dequeue()();
        }
    }

    public void Invoke(Action action, bool block = true)
    {
        ManualResetEvent done = new ManualResetEvent(!block);
        this.Actions.Enqueue(delegate
        {
            action();
            if (block) done.Set();
        });
        if (block)
        {
            done.WaitOne();
        }
    }
}
This works fine in most cases, although it won't if, for any reason, the execution (and therefore the Set) is done before the WaitOne, in which case it will just freeze (it allows for the thread to proceed, then blocks). That could be reproduced if Alpha >> Beta.
I can use booleans and whatnot, but I'm never getting a real atomic safety here. I tried some fixes, but they wouldn't work in the case where Beta >> Alpha.
I also thought of locking around both the Invoker.Execute and Invoker.Invoke methods so that we are guaranteed that the execution does not occur between enqueuing and waiting. However, the problem is that the lock also encloses the WaitOne, and therefore never finishes (deadlock).
How should I go about getting absolute atomic safety in this paradigm?
Note: It really is a requirement that I work with this design, from external dependencies. So changing design is not a real option.
EDIT: I did forget to mention that I want a blocking behaviour (based on bool block) until the delegate is executed on the Invoke call.
Use a Semaphore(Slim) instead of the ManualResetEvent.
Create a semaphore with a maximum count of 1, call WaitOne() in the calling thread, and call Release() in the delegate.
If you've already called Release(), WaitOne() should return immediately.
Make sure to Dispose() it when you're done, preferably in a using block.
If block is false, you shouldn't create it in the first place (although for SemaphoreSlim, that's not so bad).
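A minimal sketch of that suggestion, written as a replacement body for Invoke inside the question's Invoker class (this is my illustration, not SLaks' code; the accepted implementation further down adds locking so the delegate never touches a disposed semaphore):
public void Invoke(Action action, bool block = true)
{
    if (!block)
    {
        this.Actions.Enqueue(action);
        return;
    }

    using (var done = new SemaphoreSlim(0, 1)) // count starts at 0, so Wait() blocks until Release()
    {
        this.Actions.Enqueue(delegate
        {
            action();
            done.Release(); // if this runs before Wait(), Wait() returns immediately
        });
        done.Wait();
    }
}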
You can use my technique:
public void BlockingInvoke(Action action)
{
    // Note: "volatile" cannot be applied to local variables in C#;
    // the Thread.MemoryBarrier calls below provide the intended fences.
    bool isCompleted = false;
    bool isWaiting = false;
    ManualResetEventSlim waiter = new ManualResetEventSlim();
    this.Actions.Enqueue(delegate
    {
        action();
        isCompleted = true;
        Thread.MemoryBarrier();
        if (!isWaiting)
            waiter.Dispose();
        else
            waiter.Set();
    });
    isWaiting = true;
    Thread.MemoryBarrier();
    if (!isCompleted)
        waiter.Wait();
    waiter.Dispose();
}
Untested
I'm answering only to show the implementation SLaks described and my solution to ensure proper and unique disposal with locks. It's open to improvement and criticism, but it actually works.
public class Invoker
{
    private Queue<Action> Actions { get; set; }

    public Invoker()
    {
        this.Actions = new Queue<Action>();
    }

    public void Execute()
    {
        while (this.Actions.Count > 0)
        {
            this.Actions.Dequeue()();
        }
    }

    public void Invoke(Action action, bool block = true)
    {
        if (block)
        {
            // Count starts at 0 so Wait blocks until the delegate calls Release
            SemaphoreSlim semaphore = new SemaphoreSlim(0, 1);
            bool disposed = false;
            this.Actions.Enqueue(delegate
            {
                action();
                semaphore.Release();
                lock (semaphore)
                {
                    semaphore.Dispose();
                    disposed = true;
                }
            });
            lock (semaphore)
            {
                if (!disposed)
                {
                    semaphore.Wait();
                    semaphore.Dispose();
                }
            }
        }
        else
        {
            this.Actions.Enqueue(action);
        }
    }
}

C# Thread Queue Synchronize

Greetings, I am trying to play some audio files without holding up the GUI. Below is a sample of the code:
if (audio)
{
    if (ThreadPool.QueueUserWorkItem(new WaitCallback(CoordinateProc), fireResult))
    {
    }
    else
    {
        MessageBox.Show("false");
    }
}

if (audio)
{
    if (ThreadPool.QueueUserWorkItem(new WaitCallback(FireProc), fireResult))
    {
    }
    else
    {
        MessageBox.Show("false");
    }
}

if (audio)
{
    if (ThreadPool.QueueUserWorkItem(new WaitCallback(HitProc), fireResult))
    {
    }
    else
    {
        MessageBox.Show("false");
    }
}
The situation is that the samples are not being played in order; some play before the others, and I need to fix this so the samples are played one after another, in order.
How do I implement this, please?
Thank you.
EDIT: ThreadPool.QueueUserWorkItem(new WaitCallback(FireAttackProc), fireResult);
I have placed all my sound clips in FireAttackProc. What this does not do, and what I want, is to wait until the thread stops running before starting a new one, so the samples don't overlap.
Why not just create one "WorkItem" and do everything there?
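For example, a rough sketch based on the code in the question (it assumes each proc returns only after its sample has finished playing):
if (audio)
{
    // One work item runs the procs sequentially on a single pool thread,
    // so the samples cannot overlap or play out of order.
    ThreadPool.QueueUserWorkItem(state =>
    {
        CoordinateProc(state);
        FireProc(state);
        HitProc(state);
    }, fireResult);
}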
You can't guarantee the order of execution of thread pool threads. Rather than that, as suggested by others, use a single thread to run the procs in order: add the audio procs to a queue and run a single thread that pulls each proc off the queue in order and calls it. Use an event wait handle to signal the thread each time a proc is added to the queue.
An example (this doesn't completely implement the Dispose pattern... but you get the idea):
public class ConcurrentAudio : IDisposable
{
    public ConcurrentAudio()
    {
        _queue = new ConcurrentQueue<WaitCallback>();
        _waitHandle = new AutoResetEvent(false);
        _disposed = false;
        _thread = new Thread(RunAudioProcs);
        _thread.IsBackground = true;
        _thread.Name = "run-audio";
        _thread.Start(null); // pass whatever "state" you need
    }

    public void AddAudio(WaitCallback proc)
    {
        _queue.Enqueue(proc);
        _waitHandle.Set();
    }

    public void Dispose()
    {
        _disposed = true;
        _waitHandle.Set(); // wake the thread so it can notice _disposed
        _thread.Join(1000); // don't feel like waiting forever
        GC.SuppressFinalize(this);
    }

    private void RunAudioProcs(object state)
    {
        while (!_disposed)
        {
            try
            {
                WaitCallback proc = null;
                if (_queue.TryDequeue(out proc))
                    proc(state);
                else
                    _waitHandle.WaitOne();
            }
            catch (Exception x)
            {
                // Do something about the error...
                Trace.WriteLine(string.Format("Error: {0}", x.Message), "error");
            }
        }
    }

    private ConcurrentQueue<WaitCallback> _queue;
    private EventWaitHandle _waitHandle;
    private bool _disposed;
    private Thread _thread;
}
You should have a look at the BackgroundWorker option!
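A minimal sketch of that option (PlaySamplesInOrder is a hypothetical method that plays the clips one after another; BackgroundWorker lives in System.ComponentModel and raises RunWorkerCompleted back on the UI thread):
var worker = new BackgroundWorker();
worker.DoWork += (sender, e) =>
{
    // Runs on a worker thread, off the GUI.
    PlaySamplesInOrder(e.Argument); // hypothetical method
};
worker.RunWorkerCompleted += (sender, e) =>
{
    // Back on the GUI thread; report any error from DoWork.
    if (e.Error != null)
        MessageBox.Show(e.Error.Message);
};
worker.RunWorkerAsync(fireResult);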

Synchronization issues: everything seems correct, but

I wrote a multithreaded application for .NET and in a very important portion of code I have the following:
public class ContainerClass {

    private object list_lock;
    private ArrayList list;
    private object init_lock = new object();
    private ThreadClass thread;

    public void Start() {
        lock(init_lock) {
            if (thread == null) {
                thread = new ThreadClass();
                ...
            }
        }
    }

    public void Stop() {
        lock(init_lock) {
            if (thread != null) {
                thread.processList(0);
                thread.finish();
                thread.waitUntilFinished();
                thread = null;
            } else {
                throw new ApplicationException("Assertion failed - already stopped.");
            }
            ...
        }
    }

    private class ThreadedClass {

        private ContainerClass container;
        private Thread thread;
        private bool finished;
        private bool actually_finished;

        public ThreadedClass(ContainerClass container) {
            this.container = container;
            thread = new Thread(run);
            thread.IsBackground = true;
            thread.Start();
        }

        private void run() {
            bool local_finished = false;
            while (!local_finished) {
                ArrayList to_process = null;
                lock (container.list_lock) {
                    if (container.list.Count > 0) {
                        to_process = new ArrayList();
                        to_process.AddRange(container.list);
                    }
                }
                if (to_process == null) {
                    // Nothing to process so wait
                    lock (this) {
                        if (!finished) {
                            try {
                                Monitor.Wait(this);
                            } catch (ThreadInterruptedException) {
                            }
                        }
                    }
                } else if (to_process.Count > 0) {
                    // Something to process, so go ahead and process the journals,
                    int sz = to_process.Count;
                    // For all elements
                    for (int i = 0; i < sz; ++i) {
                        // Pick the lowest element to process
                        object obj = to_process[i];
                        try {
                            // process the element...
                            ...
                        } catch (IOException e) {
                            ...
                            // If there is an error processing the best thing to do is finish
                            lock (this) {
                                finished = true;
                            }
                        }
                    }
                }
                lock (this) {
                    local_finished = finished;
                    // Remove the elements that we have just processed.
                    if (to_process != null) {
                        lock (container.list_lock) {
                            int sz = to_process.Count;
                            for (int i = 0; i < sz; ++i) {
                                container.list.RemoveAt(0);
                            }
                        }
                    }
                    // Notify any threads waiting
                    Monitor.PulseAll(this);
                }
            }
            lock (this) {
                actually_finished = true;
                Monitor.PulseAll(this);
            }
        }

        public void waitUntilFinished() {
            lock (this) {
                try {
                    while (!actually_finished) {
                        Monitor.Wait(this);
                    }
                } catch (ThreadInterruptedException e) {
                    throw new ApplicationException("Interrupted: " + e.Message);
                }
            }
        }

        public void processList(int until_size) {
            lock (this) {
                Monitor.PulseAll(this);
                int sz;
                lock (container.list_lock) {
                    sz = container.list.Count;
                }
                // Wait until the sz is smaller than 'until_size'
                while (sz > until_size) {
                    try {
                        Monitor.Wait(this);
                    } catch (ThreadInterruptedException) {
                    }
                    lock (container.list_lock) {
                        sz = container.list.Count;
                    }
                }
            }
        }
    }
}
As you can see, the thread waits until the collection is empty, but it seems that a synchronization clash prevents the thread from reaching the point (the only one in the whole code) where an element is removed from the list in the ContainerClass.
This clash causes the code to never return, and the application to keep running, if the method processList is called with an until_size of 0.
I beg any better developer than me (and I guess there are a lot out there) to help me fix this small piece of code, since I really can't understand why the list isn't decremented...
Thank you very much from the bottom of my heart.
PS: I would like to underline that the code works perfectly almost all of the time; the only situation in which it breaks is when calling thread.processList(0) from ContainerClass.Stop().
Could the problem be that you are locking the ThreadClass object itself rather than a synchronizing object?
Try adding another private variable to lock on:
private static readonly object lockObject = new object();
and replace all the calls of lock(this) with lock(lockObject).
MSDN clearly advises against what you're doing:
In general, avoid locking on a public type, or instances beyond your code's control. The common constructs lock (this), lock (typeof (MyType)), and lock ("myLock") violate this guideline: lock (this) is a problem if the instance can be accessed publicly.
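Applied to the code above, the substitution also has to carry over to the Monitor.Wait/Pulse calls, since they must use the same object as the lock. A small sketch (class and member names are illustrative, not from the original code):
// Sketch only: a dedicated private lock object instead of lock(this).
public class ThreadedClassSketch
{
    private readonly object lockObject = new object();
    private bool finished;

    public void Finish()
    {
        lock (lockObject)                                  // was: lock (this)
        {
            finished = true;
            System.Threading.Monitor.PulseAll(lockObject); // was: Monitor.PulseAll(this)
        }
    }
}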
Edit:
I think I see a deadlock condition. If you call run() when there are no objects to process, or you reach a point where there are no objects to process, you lock(this) and then call Monitor.Wait(this), and the thread waits:
if (to_process == null) {
    // Nothing to process so wait
    lock (this) { /* nothing's going to get this lock again until Monitor.PulseAll(this) is called from somewhere */
        if (!finished) {
            try {
                Monitor.Wait(this); /* thread is waiting for Pulse(this) or PulseAll(this) */
            } catch (ThreadInterruptedException) {
            }
        }
    }
}
If you are in this condition when you call Container.Stop(), then when ThreadClass.processList(int) is called you call lock(this) again, which can't be entered because the run() method still holds the lock:
lock (this) { /* run still holds this lock, waiting for PulseAll(this) to be called */
    Monitor.PulseAll(this); /* this isn't called so run() never continues */
    int sz;
    lock (container.list_lock) {
        sz = container.list.Count;
    }
So, Monitor.PulseAll() can't be called to free the waiting thread in the run() method to exit the lock(this) area, so they are deadlocked waiting on each other. Right?
I think you should try to explain better what you actually want to achieve.
public void processList(int until_size) {
    lock (this) {
        Monitor.PulseAll(this);
This looks very strange, as you should call Monitor.Pulse when you change the shared state, not right after taking the lock.
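As a small sketch of what that pattern usually looks like (not from the original code; the names are illustrative): the state changes first, then the waiters are pulsed, and each waiter re-checks its condition in a loop.
public class WorkTracker
{
    private readonly object sync = new object();
    private int pendingCount;

    public void ItemProcessed()
    {
        lock (sync)
        {
            pendingCount--;                          // change the shared state...
            System.Threading.Monitor.PulseAll(sync); // ...then wake the waiters
        }
    }

    public void WaitUntilEmpty()
    {
        lock (sync)
        {
            while (pendingCount > 0)
                System.Threading.Monitor.Wait(sync); // releases the lock while waiting, re-checks after waking
        }
    }
}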
Where are you creating the worker threads? This section is not clear, as I only see Thread.Start().
By the way, I would advise you to look at PowerCollections; maybe you will find what you need there.
