I have some code that looks like the below. Does this create a deadlock?
private readonly object objectLock = new object();

public void MethodA()
{
    lock (objectLock)
    {
        MethodB();
    }
}

public void MethodB()
{
    lock (objectLock)
    {
        //do something
    }
}
UPDATE: There will be 2 threads running.
No - but this would be:
private readonly object objectLockA = new object();
private readonly object objectLockB = new object();

public void MethodA()
{
    lock (objectLockA)
    {
        lock (objectLockB)
        {
            //...
        }
    }
}

public void MethodB()
{
    lock (objectLockB)
    {
        lock (objectLockA)
        {
            //do something
        }
    }
}
If you call both methods in parallel (from 2 different threads), then you would get a deadlock...
No, it's not a deadlock. It's the same thread locking on the same synchronization object. A thread can take nested locks on the same object; it just needs to release the lock an equal number of times.
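To make that concrete, here is a minimal sketch (not from the original post) of a thread re-entering a lock it already holds, plus a second thread that only blocks, never deadlocks:

using System;
using System.Threading;

class ReentrancyDemo
{
    private static readonly object objectLock = new object();

    static void Main()
    {
        var other = new Thread(() =>
        {
            lock (objectLock)
            {
                Console.WriteLine("Second thread got the lock.");
            }
        });

        lock (objectLock)
        {
            lock (objectLock)       // same thread re-enters the lock it already holds: no deadlock
            {
                other.Start();      // the second thread now blocks on objectLock...
                Thread.Sleep(500);
            }
        }                           // ...until the nested locks are released here

        other.Join();
        Console.WriteLine("Both threads finished - blocking happened, deadlock did not.");
    }
}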
No, you'd need two lock objects to enable a deadlock.
If this is the only mutex involved, it isn't. The same thread can lock the same mutex multiple times, as long as it unlocks it an equal number of times.
Calling MethodA produces the following sequence of operations on the same thread:
Lock objectLock.
Call MethodB.
Lock objectLock.
Unlock objectLock.
Exit MethodB.
Unlock objectLock.
So, objectLock is locked twice and unlocked twice, but there is no deadlock:
If another thread tries to call MethodA it will simply block on the first lock but will not deadlock.
If it calls MethodB, the same would happen.
And if the first thread calls MethodB and then the other thread calls MethodA, again "normal" blocking would take place, but not a deadlock.
If you copy-paste the following lines, compile, and run, you will see that "never called" is not printed to the console.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

namespace deadlocktest
{
    class Program
    {
        static object object1 = new object();
        static object object2 = new object();

        public static void FunctionOne()
        {
            lock (object1)
            {
                Console.WriteLine("FunctionOne called 1");
                Thread.Sleep(1000);
                lock (object2)
                {
                    Console.WriteLine("FunctionOne called 2, never called");
                }
            }
        }

        public static void FunctionTwo()
        {
            lock (object2)
            {
                Console.WriteLine("FunctionTwo called 1");
                Thread.Sleep(1000);
                lock (object1)
                {
                    Console.WriteLine("FunctionTwo called 2, never called");
                }
            }
        }

        static void Main(string[] args)
        {
            Thread thread1 = new Thread(FunctionOne);
            Thread thread2 = new Thread(FunctionTwo);
            thread1.Start();
            thread2.Start();

            int i = 0;
            while (i < 9)
            {
                Console.WriteLine("How bad thread!");
                i++;
            }

            thread1.Join();
            thread2.Join();
            Console.ReadLine();
        }
    }
}
I am creating a custom CountdownWaitHandle class; it has the following method:
public void Signal()
{
    if (Interlocked.Decrement(ref threadsInstances) <= 0)
    {
        mre.Set();
    }
}
mre is an instance of the ManualResetEvent class. I use this class to block the current thread and wait for all threads to complete their work; each thread calls the Signal() method when it finishes its work or when an exception occurs.
So my question is: could the return value of Interlocked.Decrement and the condition (<= 0) cause any concurrency issue inside the if condition? Or do I have to use a lock statement around the if condition and its body instead of Interlocked, as in the example below:
lock (_lock)
{
    if (--threadsInstances <= 0)
    {
        mre.Set();
    }
}
Note: I am using .NET 3.5.
Complete code:
public class CountdownWaitHandle : WaitHandle
{
    private int threadsInstances = 0;
    private ManualResetEvent mre;
    private readonly object threadsyncAccess = new object();

    public CountdownWaitHandle(int initialCount)
    {
        threadsInstances = initialCount;
        mre = new ManualResetEvent(false);
    }

    public void AddCount()
    {
        Interlocked.Increment(ref threadsInstances);
    }

    public void Signal()
    {
        if (Interlocked.Decrement(ref threadsInstances) <= 0)
        {
            mre.Set();
        }
    }

    public override bool WaitOne()
    {
        return mre.WaitOne();
    }
}
In this example, I am going to use my custom CountdownWaitHandle class to download a large file in chunks from a cloud site. After each chunk finishes downloading its byte range, it releases its resources or moves on to another stream.
public static void Main(String[] args)
{
    CountdownWaitHandle customCountDown = new CountdownWaitHandle(0);

    int i = 0;
    while (i < 100)
    {
        SpecificWork work1 = new SpecificWork(startPosition, endPosition, customCountDown);
        customCountDown.AddCount();
        ThreadPool.QueueUserWorkItem(PerformTask, work1); // when the download finishes, it invokes the Signal method.
        i++;
    }

    customCountDown.WaitOne();
}
Interlocked.Decrement will work as intended in this sample, assuming you are calling Interlocked.Increment to raise the count above zero.
Of course, using CountdownEvent would be better than building your own synchronization object.
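For reference, here is a minimal sketch (not from the original answer) of the built-in System.Threading.CountdownEvent used for the same purpose; it requires .NET 4 or later, so it would mean moving off .NET 3.5, and the work items here are just placeholders:

using System;
using System.Threading;

class CountdownEventDemo
{
    static void Main()
    {
        // Start at 1 so the count cannot reach zero before all work items have been queued.
        using (var countdown = new CountdownEvent(1))
        {
            for (int i = 0; i < 10; i++)
            {
                countdown.AddCount();
                ThreadPool.QueueUserWorkItem(state =>
                {
                    try
                    {
                        Thread.Sleep(100); // placeholder for the real chunk download
                    }
                    finally
                    {
                        countdown.Signal();
                    }
                });
            }

            countdown.Signal();   // remove the initial count
            countdown.Wait();     // blocks until every work item has signalled
            Console.WriteLine("All work items finished.");
        }
    }
}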
I created an example about threads.
I know that using lock can keep threads from clashing in a critical section, but I have two questions.
1. Why does my program get stuck if I use Thread.Sleep?
In this example, I added Sleep to the two threads
because I wanted the console output to appear more slowly, so I could easily see if anything was wrong.
But if I use Thread.Sleep(), the program gets stuck!
2. In what situation should I use Thread.Sleep?
Thanks for your kind response, have a nice day.
class MyThreadExample
{
    private static int count1 = 0;
    private static int count2 = 0;
    Thread t1;
    Thread t2;

    public MyThreadExample()
    {
        t1 = new Thread(new ThreadStart(increment));
        t2 = new Thread(new ThreadStart(checkequal));
    }

    public static void Main()
    {
        MyThreadExample mt = new MyThreadExample();
        mt.t1.Start();
        mt.t2.Start();
    }

    void increment()
    {
        lock (this)
        {
            while (true)
            {
                count1++; count2++;
                //Thread.Sleep(0); stuck when use Sleep!
            }
        }
    }

    void checkequal()
    {
        lock (this)
        {
            while (true)
            {
                if (count1 == count2)
                    Console.WriteLine("Synchronize");
                else
                    Console.WriteLine("unSynchronize");
                // Thread.Sleep(0);
            }
        }
    }
}
Please take a look at the following code. Never use lock(this); use lock(syncObj) instead, because you have better control over it. Lock only the critical section (e.g. only the shared variables), not the whole while loop. In Main, add something that waits at the end, such as Console.Read(), otherwise your application just exits. This version works with or without Thread.Sleep. In your code above, one thread enters Increment or Checkequal and the lock is never released; that's why only Increment or Checkequal ever runs, never both.
internal class MyThreadExample
{
    private static int m_Count1;
    private static int m_Count2;
    private readonly object m_SyncObj = new object();
    private readonly Thread m_T1;
    private readonly Thread m_T2;

    public MyThreadExample()
    {
        m_T1 = new Thread(Increment) { IsBackground = true };
        m_T2 = new Thread(Checkequal) { IsBackground = true };
    }

    public static void Main()
    {
        var mt = new MyThreadExample();
        mt.m_T1.Start();
        mt.m_T2.Start();
        Console.Read();
    }

    private void Increment()
    {
        while (true)
        {
            lock (m_SyncObj)
            {
                m_Count1++;
                m_Count2++;
            }
            Thread.Sleep(1000); // sleeping outside the lock, so this no longer gets stuck
        }
    }

    private void Checkequal()
    {
        while (true)
        {
            lock (m_SyncObj)
            {
                Console.WriteLine(m_Count1 == m_Count2 ? "Synchronize" : "unSynchronize");
            }
            Thread.Sleep(1000);
        }
    }
}
Thread is a little bit old-style. If you are new to .NET and using .NET 4.5 or above, use Task instead; it is much better. All of the newer multithreading features in .NET are based on Task, such as async/await:
public static void Main()
{
    var mt = new MyThreadExample();
    Task.Run(() => { mt.Increment(); });
    Task.Run(() => { mt.Checkequal(); });
    Console.Read();
}
Consider the following pattern:
private AutoResetEvent signal = new AutoResetEvent(false);

private void Work()
{
    while (true)
    {
        Thread.Sleep(5000);
        signal.Set();
        //has a waiting thread definitely been signaled by now?
        signal.Reset();
    }
}

public void WaitForNextEvent()
{
    signal.WaitOne();
}
The purpose of this pattern is to allow external consumers to wait for a certain event (e.g. a message arriving). WaitForNextEvent is not called from within the class.
To give an example that should be familiar, consider System.Diagnostics.Process. It exposes an Exited event, but it also exposes a WaitForExit method, which allows the caller to wait synchronously until the process exits. This is what I am trying to achieve here.
The reason I need signal.Reset() is that if a thread calls WaitForNextEvent after signal.Set() has already been called (or in other words, if .Set was called when no threads were waiting), it returns immediately, as the event has already been previously signaled.
The question
Is it guaranteed that a thread calling WaitForNextEvent() will be signaled before signal.Reset() is called? If not, what are other solutions for implementing a WaitFor method?
Instead of using AutoResetEvent or ManualResetEvent, use this:
public sealed class Signaller
{
    public void PulseAll()
    {
        lock (_lock)
        {
            Monitor.PulseAll(_lock);
        }
    }

    public void Pulse()
    {
        lock (_lock)
        {
            Monitor.Pulse(_lock);
        }
    }

    public void Wait()
    {
        Wait(Timeout.Infinite);
    }

    public bool Wait(int timeoutMilliseconds)
    {
        lock (_lock)
        {
            return Monitor.Wait(_lock, timeoutMilliseconds);
        }
    }

    private readonly object _lock = new object();
}
Then change your code like so:
private Signaller signal = new Signaller();

private void Work()
{
    while (true)
    {
        Thread.Sleep(5000);
        signal.Pulse(); // Or signal.PulseAll() to signal ALL waiting threads.
    }
}

public void WaitForNextEvent()
{
    signal.Wait();
}
There is no guarantee. This:
AutoResetEvent flag = new AutoResetEvent(false);

new Thread(() =>
{
    Thread.CurrentThread.Priority = ThreadPriority.Lowest;
    Console.WriteLine("Work Item Started");
    flag.WaitOne();
    Console.WriteLine("Work Item Executed");
}).Start();

// For fast systems, you can help by occupying processors.
for (int ix = 0; ix < 2; ++ix)
{
    new Thread(() => { while (true) ; }).Start();
}

Thread.Sleep(1000);
Console.WriteLine("Sleeped");
flag.Set();
// Uncomment here to make it work
//Thread.Sleep(1000);
flag.Reset();

Console.WriteLine("Finished");
Console.ReadLine();
won't print "Work Item Executed" on my system. If I add a Thread.Sleep between the Set and the Reset, it does print it. Note that this is very processor-dependent, so you might have to create tons of threads to "fill" the CPUs. On my PC it's reproducible 50% of the time :-)
For the Exited:
readonly object mylock = new object();
then somewhere:
lock (mylock)
{
    // Your code goes here
}
and the WaitForExit:
void WaitForExit()
{
    lock (mylock) { }   // returns only once the lock has been released
    // exited
}

bool IsExited()
{
    bool lockTaken = false;

    try
    {
        Monitor.TryEnter(mylock, ref lockTaken);
    }
    finally
    {
        if (lockTaken)
        {
            Monitor.Exit(mylock);
        }
    }

    return lockTaken;
}
Note that the lock construct isn't compatible with async/await (as is true of nearly all of .NET's locking primitives).
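As a side note (my own sketch, not part of the answer): if you do need an awaitable wait, SemaphoreSlim exposes WaitAsync, which can be awaited instead of blocking a thread:

using System;
using System.Threading;
using System.Threading.Tasks;

class AsyncWaitSketch
{
    private static readonly SemaphoreSlim gate = new SemaphoreSlim(0, 1);

    static void Main()
    {
        Task waiter = WaitForNextEventAsync();

        Thread.Sleep(1000);   // placeholder for the real work
        gate.Release();       // signal one waiting caller

        waiter.Wait();
    }

    private static async Task WaitForNextEventAsync()
    {
        await gate.WaitAsync();   // asynchronous wait, unlike lock/Monitor
        Console.WriteLine("Event observed without blocking a thread while waiting.");
    }
}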
I would use TaskCompletionSources:
private volatile TaskCompletionSource<int> signal = new TaskCompletionSource<int>();

private void Work()
{
    while (true)
    {
        Thread.Sleep(5000);
        var oldSignal = signal;
        signal = new TaskCompletionSource<int>();
        //has a waiting thread definitely been signaled by now?
        oldSignal.SetResult(0);
    }
}

public void WaitForNextEvent()
{
    signal.Task.Wait();
}
By the time that the code calls SetResult, no new code entering WaitForNextEvent can obtain the TaskCompletionSource that is being signalled.
I believe it is not guaranteed.
However, I don't understand your logic flow. If your main thread sets the signal, why should it wait until that signal reaches its destination? Wouldn't it be better to continue your "after signal set" logic in the thread that was waiting?
If you cannot do that, I recommend using a second WaitHandle to signal the first thread that the second one has received the signal. But I cannot see any pros to such a strategy.
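For what it's worth, here is a rough sketch (mine, not the answer's) of that two-handle handshake, where the signalling thread waits for an acknowledgement that a waiter was actually released:

using System;
using System.Threading;

class HandshakeSketch
{
    private static readonly AutoResetEvent signal = new AutoResetEvent(false);
    private static readonly AutoResetEvent ack = new AutoResetEvent(false);

    static void Main()
    {
        var waiter = new Thread(() =>
        {
            signal.WaitOne();   // wait for the event
            ack.Set();          // acknowledge that we saw it
            Console.WriteLine("Waiter observed the signal.");
        });
        waiter.Start();

        Thread.Sleep(1000);     // placeholder for the real work
        signal.Set();
        ack.WaitOne();          // block until a waiter has definitely been released
        Console.WriteLine("Signal was acknowledged.");
        waiter.Join();
    }
}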
I've read several articles and posts that say that lock(this), lock(typeof(MyType)), lock("a string") are all bad practice because another thread could lock on the same key and cause a deadlock. In order to understand this problem, I was trying to create some sample code to illustrate the deadlock but have been unable to wrap my head around this.
Can someone write a concise bit of code that illustrates this classic problem? Please keep it short, I can digest code in smaller chunks only.
Edit:
I think lassevk sums it up well: the real problem is that you have lost control over your locks. Once that happens, you cannot control the order in which the locks are taken, and you are allowing a potential deadlock situation.
lock(this), lock(typeof(MyType)), etc. are all situations where you have chosen a lock that is impossible to control.
A deadlock will only occur if you have more than one lock. You need a situation where both threads hold a resource that the other needs (which means there have to be at least two resources, and the two threads have to attempt to acquire them in a different order).
So a simple example:
// thread 1
lock (typeof(int)) {
    Thread.Sleep(1000);
    lock (typeof(float)) {
        Console.WriteLine("Thread 1 got both locks");
    }
}

// thread 2
lock (typeof(float)) {
    Thread.Sleep(1000);
    lock (typeof(int)) {
        Console.WriteLine("Thread 2 got both locks");
    }
}
Assuming both threads are started within a second of each other, they will both have time to grab the first lock before either gets to the inner lock. Without the Sleep() call, one of the threads would most likely have time to acquire and release both locks before the other thread even got started.
The idea is that you should never lock on something when you cannot control who else has access to it.
Type objects are singletons visible to every piece of .NET code, and you cannot control who locks on your "this" object from the outside.
The same goes for strings: since strings are immutable, the framework keeps just one instance of each "hard-coded" string in a pool (the string is said to be interned). If you write the string "hello" twice in your code, you will always get the same object.
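A tiny sketch (my own, not from the answer) that makes the interning point concrete:

using System;

class InterningDemo
{
    static void Main()
    {
        string a = "hello";
        string b = "hello";

        // Both literals refer to the single interned instance,
        // so lock(a) and lock(b) would take the same monitor.
        Console.WriteLine(ReferenceEquals(a, b));   // True
    }
}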
Consider the following example: you wrote only Thread1 in your own private code, while Thread2 is called by some library you are using on a background thread...
void Thread1()
{
    lock (typeof(int))
    {
        Thread.Sleep(1000);
        lock (typeof(long))
        {
            // do something
        }
    }
}

void Thread2()
{
    lock (typeof(long))
    {
        Thread.Sleep(1000);
        lock (typeof(int))
        {
            // do something
        }
    }
}
Sure, here you go.
Note that the common example for a deadlock is when you acquire multiple locks, and two or more threads end up waiting for each other.
For instance, two threads that locks like this:
Thread 1            Thread 2
Lock "A"            Lock "B"
Lock "B"            Lock "A"    <-- both threads will stop dead here,
                                    waiting for the lock to become available.
However, in this example I didn't bother with that; I just let one thread lock indefinitely. You really don't want to lose control over your locks, so while this is a contrived example, the fact that the background thread can completely block the main thread like this is bad.
using System;
using System.Threading;

namespace ConsoleApplication7
{
    public class Program
    {
        public static void Main(string[] args)
        {
            LockableClass lockable = new LockableClass();
            new Thread(new ParameterizedThreadStart(BackgroundMethod)).Start(lockable);
            Thread.Sleep(500);
            Console.Out.WriteLine("calling Reset");
            lockable.Reset();
        }

        private static void BackgroundMethod(Object lockable)
        {
            lock (lockable)
            {
                Console.Out.WriteLine("background thread got lock now");
                Thread.Sleep(Timeout.Infinite);
            }
        }
    }

    public class LockableClass
    {
        public Int32 Value1 { get; set; }
        public Int32 Value2 { get; set; }

        public void Reset()
        {
            Console.Out.WriteLine("attempting to lock on object");
            lock (this)
            {
                Console.Out.WriteLine("main thread got lock now");
                Value1 = 0;
                Value2 = 0;
            }
        }
    }
}
This is pretty standard badness: grabbing the locks out of order and then sleeping while holding a lock. Two bad things to do. :)
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading;

namespace DeadLock
{
    public class Program
    {
        static void Main(string[] args)
        {
            var ddt = new DontDoThat();
            ddt.Go();
        }
    }

    public class DontDoThat
    {
        private int _badSharedState = 0;
        private readonly object _lock1 = new object();
        private readonly object _lock2 = new object();

        public void Go()
        {
            new Thread(BadGuy1).Start();
            new Thread(BadGuy2).Start();
            Console.WriteLine("Leaving Go!");
        }

        public void BadGuy1()
        {
            lock (_lock1)
            {
                Thread.Sleep(100); // yielding while holding the lock is bad
                lock (_lock2)
                {
                    _badSharedState++;
                    Console.Write("From Bad Guy #1: {0})", _badSharedState);
                }
            }
        }

        public void BadGuy2()
        {
            lock (_lock2)
            {
                lock (_lock1)
                {
                    _badSharedState++;
                    Console.Write("From Bad Guy #2: {0})", _badSharedState);
                }
            }
        }
    }
}
The problem is that lock("a string") locks on a singleton. This means that other code that uses the same lock could end up waiting forever.
For example:
using System;
using System.Threading;

namespace ThreadLock
{
    class Program
    {
        static void Main(string[] args)
        {
            lock ("my lock")
            {
                ManualResetEvent evt = new ManualResetEvent(false);
                WorkerObject worker = new WorkerObject(evt);
                Thread t = new Thread(new ThreadStart(worker.Work));
                t.Start();
                evt.WaitOne();
            }
        }
    }

    class WorkerObject
    {
        private ManualResetEvent _evt;

        public WorkerObject(ManualResetEvent evt)
        {
            _evt = evt;
        }

        public void Work()
        {
            lock ("my lock")
            {
                Console.WriteLine("worked.");
                _evt.Set();
            }
        }
    }
}
In this case, the calling code takes a lock on a string and then creates a worker object. The worker object's Work() locks on the same string, which is an interned singleton in C#. It ends up in deadlock because the caller owns the lock and is waiting for a signal that will never come.
class Character
{
    public Character Other;
    public string Name;
    private object locker = new object();

    public Character(string name)
    {
        Name = name;
    }

    public void Go()
    {
        lock (locker)
        {
            Thread.Sleep(1000);
            Console.WriteLine("go in {0}", Name);
            Other.Go();
        }
    }
}

class Program
{
    static void Main(string[] args)
    {
        Character a = new Character("A");
        Character b = new Character("B");
        a.Other = b;
        b.Other = a;

        new Thread(a.Go).Start();
        b.Go();
        Console.ReadLine();
    }
}
For a "log information for support" type of function I'd like to enumerate and dump active thread information.
I'm well aware of the fact that race conditions can make this information semi-inaccurate, but I'd like to try to get the best possible result, even if it isn't 100% accurate.
I looked at Process.Threads, but it returns ProcessThread objects; I'd like to have a collection of Thread objects, so that I can log their names and whether they're background threads or not.
Is there such a collection available, even if it is just a snapshot of the active threads when I call it?
ie.
Thread[] activeThreads = ??
Note, to be clear, I am not asking about Process.Threads; this collection gives me a lot, but not all, of what I want. I want to know how much time specific named threads in our application are currently using (which means I will have to look at connecting the two types of objects later, but the names are more important than the CPU time to begin with).
If you're willing to replace your application's Thread creations with another wrapper class, said wrapper class can track the active and inactive Threads for you. Here's a minimal workable shell of such a wrapper:
namespace ThreadTracker
{
    using System.Collections.Generic;
    using System.Collections.ObjectModel;
    using System.Threading;

    public class TrackedThread
    {
        private static readonly IList<Thread> threadList = new List<Thread>();
        private readonly Thread thread;
        private readonly ParameterizedThreadStart start1;
        private readonly ThreadStart start2;

        public TrackedThread(ParameterizedThreadStart start)
        {
            this.start1 = start;
            this.thread = new Thread(this.StartThreadParameterized);
            lock (threadList)
            {
                threadList.Add(this.thread);
            }
        }

        public TrackedThread(ThreadStart start)
        {
            this.start2 = start;
            this.thread = new Thread(this.StartThread);
            lock (threadList)
            {
                threadList.Add(this.thread);
            }
        }

        public TrackedThread(ParameterizedThreadStart start, int maxStackSize)
        {
            this.start1 = start;
            this.thread = new Thread(this.StartThreadParameterized, maxStackSize);
            lock (threadList)
            {
                threadList.Add(this.thread);
            }
        }

        public TrackedThread(ThreadStart start, int maxStackSize)
        {
            this.start2 = start;
            this.thread = new Thread(this.StartThread, maxStackSize);
            lock (threadList)
            {
                threadList.Add(this.thread);
            }
        }

        public static int Count
        {
            get
            {
                lock (threadList)
                {
                    return threadList.Count;
                }
            }
        }

        public static IEnumerable<Thread> ThreadList
        {
            get
            {
                lock (threadList)
                {
                    return new ReadOnlyCollection<Thread>(threadList);
                }
            }
        }

        // either: (a) expose the thread object itself via a property or,
        // (b) expose the other Thread public methods you need to replicate.
        // This example uses (a).
        public Thread Thread
        {
            get
            {
                return this.thread;
            }
        }

        private void StartThreadParameterized(object obj)
        {
            try
            {
                this.start1(obj);
            }
            finally
            {
                lock (threadList)
                {
                    threadList.Remove(this.thread);
                }
            }
        }

        private void StartThread()
        {
            try
            {
                this.start2();
            }
            finally
            {
                lock (threadList)
                {
                    threadList.Remove(this.thread);
                }
            }
        }
    }
}
and a quick test driver of it (note I do not iterate over the list of threads, merely get the count in the list):
namespace ThreadTracker
{
    using System;
    using System.Threading;

    internal static class Program
    {
        private static void Main()
        {
            var thread1 = new TrackedThread(DoNothingForFiveSeconds);
            var thread2 = new TrackedThread(DoNothingForTenSeconds);
            var thread3 = new TrackedThread(DoNothingForSomeTime);

            thread1.Thread.Start();
            thread2.Thread.Start();
            thread3.Thread.Start(15);

            while (TrackedThread.Count > 0)
            {
                Console.WriteLine(TrackedThread.Count);
            }

            Console.ReadLine();
        }

        private static void DoNothingForFiveSeconds()
        {
            Thread.Sleep(5000);
        }

        private static void DoNothingForTenSeconds()
        {
            Thread.Sleep(10000);
        }

        private static void DoNothingForSomeTime(object seconds)
        {
            Thread.Sleep(1000 * (int)seconds);
        }
    }
}
Not sure if you can go such a route, but it will accomplish the goal if you're able to incorporate it at an early stage of development.
Is it feasible for you to store thread information in a lookup as you create each thread in your application?
As each thread starts, you can get its ID using AppDomain.GetCurrentThreadId(). Later, you can use this to cross reference with the data returned from Process.Threads.
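A rough sketch of that idea (my own illustration, not from the answer): record the OS thread ID when each named thread starts, then match it against Process.Threads when dumping. Note that AppDomain.GetCurrentThreadId is marked obsolete in newer frameworks, which is why the warning is suppressed here:

using System;
using System.Collections.Generic;
using System.Diagnostics;

static class ThreadRegistry
{
    // Maps a friendly thread name to the OS thread ID captured at startup.
    private static readonly Dictionary<string, int> osThreadIds = new Dictionary<string, int>();
    private static readonly object sync = new object();

    public static void RegisterCurrentThread(string name)
    {
#pragma warning disable 618 // AppDomain.GetCurrentThreadId is obsolete
        int osId = AppDomain.GetCurrentThreadId();
#pragma warning restore 618
        lock (sync)
        {
            osThreadIds[name] = osId;
        }
    }

    public static void DumpCpuTimes()
    {
        lock (sync)
        {
            foreach (ProcessThread pt in Process.GetCurrentProcess().Threads)
            {
                foreach (var entry in osThreadIds)
                {
                    if (entry.Value == pt.Id)
                    {
                        Console.WriteLine("{0}: {1}", entry.Key, pt.TotalProcessorTime);
                    }
                }
            }
        }
    }
}

Each worker thread would call ThreadRegistry.RegisterCurrentThread("SomeName") as its first action, and the "log information for support" code would call ThreadRegistry.DumpCpuTimes().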