About lock objects in C#

Consider the following code:
static void AddItem()
{
    lock (_list)
        _list.Add("Item " + _list.Count);   // Lock 1

    string[] items;
    lock (_list)
        items = _list.ToArray();            // Lock 2

    foreach (string s in items)
        Console.WriteLine(s);
}
If Thread A holds Lock 2 and Thread B attempts to acquire Lock 1, will B get the lock or not, given that both locks use the same lock object?

No, thread B will need to wait until thread A releases the lock. That's the point of it being the same lock object, after all - there's one lock. Where the lock is acquired or released is irrelevant: only one thread can "own" the monitor at a time.
I would strongly advise you to use braces for readability, by the way:
lock (_list)
{
    _list.Add(...);
}

No, B will not. Both are locking on the same object, and therefore the two locks are "linked." For this reason, if you need to highly optimise such code, there are times when you might consider multiple lock objects.
As a side note, you should not be locking on the list itself but on an object created specifically for that purpose.

No, since they use the same locking object, they are mutually exclusive.
Code often locks an object (for example, a list) to perform an operation on it without interference from other threads. This requires that the object stay locked no matter which operation is performed.
To elaborate, say you have a list that is designed to be threadsafe. If you try adding and deleting multiple items simultaneously, you could corrupt the list. By locking the list whenever it needs to be modified, you can help ensure thread safety.
This all depends on the fact that one object will make all locks mutually exclusive.

If Thread A is using the lock, then no other thread can use it (regardless of where the lock is being used). So, thread B will be blocked until that lock is free.

Consider that:
lock (obj)
{
    // Do stuff
}
Is shorthand for:
Monitor.Enter(obj);
try
{
    // Do stuff
}
finally
{
    Monitor.Exit(obj);
}
Now consider that Monitor.Enter() is a method call like any other. It knows nothing about where in the code it was called; the only thing it knows is which object was passed to it.
As far as it's concerned, what you call "Lock 1" and "Lock 2" are the same lock.
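To make that concrete, here is a small runnable sketch (class and method names are hypothetical) showing that a thread entering lock (_list) at one site blocks while another thread holds lock (_list) at a different site:

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Threading;

// Demonstrates that "Lock 1" and "Lock 2" are the same monitor:
// entering lock (_list) anywhere blocks while any other thread
// holds lock (_list) anywhere else.
class SameLockDemo
{
    static readonly List<string> _list = new List<string>();

    public static long MeasureWaitMs()
    {
        var a = new Thread(() =>
        {
            lock (_list)            // "Lock 2"
                Thread.Sleep(500);  // hold the monitor for half a second
        });
        a.Start();
        Thread.Sleep(100);          // let thread A acquire the lock first

        var sw = Stopwatch.StartNew();
        lock (_list)                // "Lock 1": blocks until A exits
            sw.Stop();
        a.Join();
        return sw.ElapsedMilliseconds;
    }

    static void Main()
    {
        Console.WriteLine($"Blocked for ~{MeasureWaitMs()} ms on the same monitor");
    }
}
```

The second thread waits roughly as long as the first thread holds the monitor, regardless of which lock statement each thread is in.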

Related

Conditional thread lock in C#

Is it possible to have a conditional thread lock when the underlying condition is not constant?
I have two functions A and B, and a condition to decide which function to execute.
A is thread safe by itself, multiple calls to A can execute simultaneously, B is not, and is Synchronized. But during execution of B the condition can change (from false to true) and therefore all threads executing A at that time will throw errors.
if (condition)
{
    A();
}
else
{
    B();
}
A - thread safe
B - Synchronized using [MethodImpl(MethodImplOptions.Synchronized)]
Therefore, I am looking for a way to lock A but only when B is running.
Please suggest a way to achieve this.
Some elaborations:
I am creating a cache, and performance is very crucial, thus a blanket lock is not feasible.
Condition is whether or not the requested data is present in the cache.
A() = AddToUpdates() - Executed on a cache hit, just adds to the number of updates for a particular cache key, using a concurrent dictionary.
B() = ProcessUpdates() and EvictLeastPriorityEntry() - Executed on a cache miss: all previous updates will be processed and the underlying data structure storing the ordering of cache entries will be rearranged.
And then the entry with least priority will be removed.
As mentioned in the accepted answer ReaderWriterLock seems to be the way to go.
Just one problem though,
Let's say thread1 starts execution and a cache hit occurs (on the entry with the least priority), meaning the if condition is true and thread1 enters the if block. But before it calls A(), control switches to thread2.
thread2 - a cache miss occurs, and reordering and eviction (of the entry that A() in thread1 needed access to) are performed.
Now, when control returns to thread1, an error will occur.
This is the solution I feel should work:
_lock.EnterReadLock();
if (condition)
{
    A();
}
_lock.ExitReadLock();

if (!condition)
{
    B();
}

void A()
{
    // ....
}

void B()
{
    _lock.EnterWriteLock();
    // ...
    _lock.ExitWriteLock();
}
Will this work?
Thank you.
A possible solution to your problem might be the ReaderWriterLockSlim class. This is a synchronization primitive that allows multiple concurrent readers, or one exclusive writer, but not both at the same time.
Use ReaderWriterLockSlim to protect a resource that is read by multiple threads and written to by one thread at a time. ReaderWriterLockSlim allows multiple threads to be in read mode, allows one thread to be in write mode with exclusive ownership of the lock, and allows one thread that has read access to be in upgradeable read mode, from which the thread can upgrade to write mode without having to relinquish its read access to the resource.
Example:
private readonly ReaderWriterLockSlim _lock = new();

void A()
{
    _lock.EnterReadLock();
    try
    {
        //...
    }
    finally { _lock.ExitReadLock(); }
}

void B()
{
    _lock.EnterWriteLock();
    try
    {
        //...
    }
    finally { _lock.ExitWriteLock(); }
}
Your question looks a lot like this:
A() is some read-only method, so it is thread safe; different executions of A in parallel are OK.
B() writes to or mutates things that A() uses, so A() is no longer thread safe if it executes at the same time as B().
For example, B() could write to a List while executions of A() read from it, and you would get an "InvalidOperationException: Collection Was Modified" thrown from A().
I advise you to search for the "producer/consumer problem" on Google and look at the many examples available.
But if you absolutely want B to begin executing while executions of A have not yet terminated, you can add a checkpoint in A() using the Monitor class, which is used to lock a resource and synchronize with other threads. That is more complex, though, and I would first try the producer/consumer pattern to see if it fills the need.
Some more things:
The BlockingCollection<T> class is also worth a look; it may fit your exact need and is easy to use.
The use of MethodImplOptions.Synchronized is not recommended because it locks on a publicly visible object (the instance itself). The usual practice is a private lock (private readonly object _lock = new object();) so that no one except the maintainer of the class can lock on it, preventing deadlocks (and preventing other people from accusing your code of a bug after they locked your class instance without knowing you do the same internally).
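For illustration, here is a minimal producer/consumer sketch using BlockingCollection<T> (the class and method names are hypothetical; it assumes updates can simply be queued by producers and drained by a single consumer):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

// Producer/consumer sketch: many producers add updates concurrently;
// a single consumer processes them, so the processing itself needs
// no extra locking.
class UpdateQueueDemo
{
    public static int ProcessAll(int producerItems)
    {
        var updates = new BlockingCollection<int>();
        int processed = 0;

        var consumer = Task.Run(() =>
        {
            // GetConsumingEnumerable blocks until items arrive and
            // ends when CompleteAdding has been called and the queue is empty.
            foreach (var _ in updates.GetConsumingEnumerable())
                processed++;        // single consumer: no lock needed here
        });

        // Add is thread-safe; BlockingCollection synchronizes internally.
        Parallel.For(0, producerItems, i => updates.Add(i));

        updates.CompleteAdding();   // signal that no more items will come
        consumer.Wait();
        return processed;
    }
}
```

This sidesteps the conditional-lock problem entirely: mutation happens on one thread, so readers and the writer never race.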

Multithreading: difference between types of locking objects

Please explain the difference between these two types of locking.
I have a List which I want to access thread-safe:
var tasks = new List<string>();
1.
var locker = new object();
lock (locker)
{
    tasks.Add("work 1");
}
2.
lock (tasks)
{
    tasks.Add("work 2");
}
My thoughts:
1. Prevents two different threads from running the locked block of code at the same time. But if another thread runs a different method where it tries to access tasks, this type of lock won't help.
2. Blocks the List<> instance, so other threads in other methods will be blocked until I unlock tasks.
Am I right or mistaken?
(2) only blocks other code that explicitly calls lock (tasks). Generally, you should only do this if you know that tasks is a private field and thus can enforce throughout your class that lock (tasks) means locking operations on the list. This can be a nice shortcut when the lock is conceptually linked with access to the collection and you don't need to worry about public exposure of the lock. You don't get this 'for free', though; it needs to be explicitly used just like locking on any other object.
They do the same thing. Any other code that tries to modify the list without locking the same object will cause potential race conditions.
A better way might be to encapsulate the list in another object that obtains a lock before doing any operations on the underlying list; then any other code can simply call methods on the wrapper object without worrying about obtaining the lock.
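A sketch of such a wrapper (the class name is hypothetical): the list and its private lock are encapsulated together, so callers can never forget to synchronize:

```csharp
using System.Collections.Generic;

// Hypothetical wrapper: every operation takes a private lock internally,
// so callers never handle the lock themselves.
class SafeTaskList
{
    private readonly List<string> _tasks = new List<string>();
    private readonly object _lock = new object();   // private: outside code cannot lock it

    public void Add(string task)
    {
        lock (_lock) { _tasks.Add(task); }
    }

    public int Count
    {
        get { lock (_lock) { return _tasks.Count; } }
    }

    // Returns a copy, so callers can iterate without holding the lock.
    public string[] Snapshot()
    {
        lock (_lock) { return _tasks.ToArray(); }
    }
}
```

Because the lock object is private, no outside code can take it, which avoids the public-lock pitfalls discussed elsewhere on this page.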

.net Lock - Two Questions

Two questions about the Lock() construct in .net
First, I am aware that if an object is locked within one class and another class attempts to lock the same object this produces a deadlock. But why? I have read about it on MSDN but MSDN is rarely all that clear.
----Edit Question One----
Still confused. I have a main thread (UI thread) that spawns many Threadpool threads. Each child thread locks the data before it works with it. This works fine every time.
If I then attempt to lock the same data from the UI thread to check if I should even bother creating a new thread for an edge case I create deadlock nearly every time.
----Edit Question Two----
Secondly, if I have a compound object that I lock, are all child objects within it locked as well? Short code demo:
internal sealed class Update
{
    // Three objects instantiated via other external assemblies
    public DataObject One { get; set; }
    public DataObject Two { get; set; }
    public ReplayStatus Status { get; set; }
}
If I call lock(UpdateObject), are each of the three internal objects and all of their child objects locked as well?
So should I do something like this to prevent threads from playing with my data objects:
lock (UpdateObject.One)
{
    lock (UpdateObject.Two)
    {
        lock (UpdateObject.Status)
        {
            // Do stuff
        }
    }
}
First, I am aware that if an object is locked within one class and another class attempts to lock the same object this produces a deadlock.
No. If one thread locks an object and a second thread attempts to lock that object, that second thread must wait for the first thread to exit the lock.
Deadlock is something else:
1. thread1 locks instanceA
2. thread2 locks instanceB
3. thread1 attempts to lock instanceB and now must wait on thread2
4. thread2 attempts to lock instanceA and now must wait on thread1
These two threads can no longer execute, and so never release their locks. What a mess.
If I call lock(UpdateObject), are each of the three internal objects and all of their child objects locked as well?
No, the "lock" is only on the locked instance. Note: the lock doesn't prevent anything other than a second thread from acquiring a lock on that instance at the same time.
First, the whole point of a lock is that two sections of code can't get ahold of the same lock at once. This is to coordinate multiple threads working with the same stuff without interfering with each other. If you have a lock on an object, then anyone else that tries to get the lock will block (wait) until the original lock is released (only one thread can have the lock at any given time). You only have a deadlock if the first thread never gives up the lock, or if both threads are waiting for something from each other and neither can proceed until each gets what it's waiting for.
Second, if you lock an object in C#, you're not really "locking" the object in any semantic sense. You're acquiring a "lock" on the object (which you later release or relinquish). The object is purely a convenient token that is used to uniquely identify which lock you wish to obtain. So no, a lock on an object does not create a lock on any sub-parts of that object.
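Following that advice, here is a sketch of the single-lock alternative to the nested locks above (DataObject and ReplayStatus are stand-ins for the external types, and Apply is a hypothetical operation): one private lock object guards, by convention, every access to the three properties:

```csharp
// Stand-in types for the external assemblies in the question.
class DataObject   { public int Value; }
class ReplayStatus { public bool Done; }

internal sealed class Update
{
    // One private lock object; by convention every access to the three
    // properties happens inside lock (_sync), so no nesting is needed.
    private readonly object _sync = new object();

    public DataObject   One    { get; } = new DataObject();
    public DataObject   Two    { get; } = new DataObject();
    public ReplayStatus Status { get; } = new ReplayStatus();

    // Hypothetical operation: mutates all three children atomically
    // with respect to other callers of Apply.
    public void Apply(int delta)
    {
        lock (_sync)
        {
            One.Value += delta;
            Two.Value += delta;
            Status.Done = true;
        }
    }
}
```

A single lock avoids the ordering hazards that nested locks introduce, at the cost of serializing all operations on the compound object.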

Is it good practice to lock a thread in order to make things transactional in winforms?

This is an old-school WinForms application that I am working with, and the design pattern that was used is as follows:
Whenever you need to make things transactional, an operation is performed on its own thread, the thread takes a lock (a specific lock object is used for each operation), a call is made to the WCF service, some local objects are updated, and then the lock is released.
Is this good practice?
Yes, but be careful with multithreading and read up on it thoroughly, as too many locks might create a deadlock situation.
I don't quite know what you mean, "lock a thread." Is it something like this?
static object ThreadLock = new object();

void ThreadProc(object state)
{
    lock (ThreadLock)
    {
        // do stuff here
    }
}
If so, there's nothing really wrong with that design. Your UI thread spawns a thread that's supposed to execute that code, and the lock prevents multiple threads from executing concurrently. It's a little bit wasteful in that you could potentially have many threads queued up behind the lock, but in practice you probably don't have more than one or two threads waiting. There are more efficient ways to do it (implement a task queue of some sort), but what you have is simple and effective.
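As a sketch of that task-queue alternative (all names hypothetical), a single worker thread can drain a queue using Monitor.Wait/Pulse, so producers enqueue work instead of piling up behind a lock:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

// Hand-rolled task queue using Monitor.Wait/Pulse: work items queue up
// and a single worker runs them, instead of threads blocking on a lock.
class WorkQueue
{
    private readonly Queue<Action> _work = new Queue<Action>();
    private readonly object _sync = new object();
    private readonly Thread _worker;
    private bool _done;

    public WorkQueue()
    {
        _worker = new Thread(Run);
        _worker.Start();
    }

    public void Enqueue(Action item)
    {
        lock (_sync)
        {
            _work.Enqueue(item);
            Monitor.Pulse(_sync);       // wake the worker
        }
    }

    public void Shutdown()              // drains remaining items, then stops
    {
        lock (_sync) { _done = true; Monitor.Pulse(_sync); }
        _worker.Join();
    }

    private void Run()
    {
        while (true)
        {
            Action item;
            lock (_sync)
            {
                while (_work.Count == 0 && !_done)
                    Monitor.Wait(_sync);    // releases the lock while waiting
                if (_work.Count == 0) return;
                item = _work.Dequeue();
            }
            item();                          // run the item outside the lock
        }
    }
}
```

The producer threads return immediately after Enqueue, so nothing queues up behind a lock; BlockingCollection<T> provides the same pattern ready-made.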
As long as you are not waiting on multiple lock objects, this should be fine. Deadlock occurs when you have a situation like this:
Thread A:
lock (lockObject1)
{
    // Do some stuff
    lock (lockObject2)
    {
        // Do some stuff
    }
}
Thread B:
lock (lockObject2)
{
    // Do some stuff
    lock (lockObject1)
    {
        // Do some stuff
    }
}
If thread A locks lockObject1, and thread B locks lockObject2 before thread A does, then both threads will be waiting for an object that is locked by the other thread, and neither will unlock, because each is waiting while holding a locked object. This is an oversimplified example; there are many ways one can end up in this situation.
To avoid deadlock, do not wait on a second object while you have a first object locked. If you lock one object at a time like this, you can't get deadlocked because eventually, the locking thread will release the object a waiting thread needs. So for example, the above should be unrolled:
Thread A:
lock (lockObject1)
{
    // Do some stuff
}
lock (lockObject2)
{
    // Do some stuff
}
Thread B:
lock (lockObject2)
{
    // Do some stuff
}
lock (lockObject1)
{
    // Do some stuff
}
In this case, each lock operation will complete without trying to acquire other resources, and so deadlock is avoided.
This is not making the action transactional. I would take "transactional" to mean that either the entire operation succeeds or it has no effect; if I update two local objects inside your synchronization block, an error with the second does not roll back changes to the first.
Also, there is nothing stopping the main thread from using the two objects while they are being updated -- it needs to cooperate by also locking.
Locking in the background thread is only meaningful if you also lock when you use those objects in the main thread.

Is there any way to determine the number of threads waiting to lock in C#?

I'm using simple locking in C# using the lock statement. Is there any way to determine how many other threads are waiting to get a lock on the object? I basically want to limit the number of threads that are waiting for a lock to 5. My code would throw an exception if a sixth thread needs to get a lock.
This can be easily accomplished via the Semaphore class. It will do the counting for you. Notice in the code below that I use a semaphore to do a non-blocking check of the number of threads waiting for the resource and then I use a plain old lock to actually serialize access to that resource. An exception is thrown if there are more than 5 threads waiting for the resource.
public class YourResourceExecutor
{
    private Semaphore m_Semaphore = new Semaphore(5, 5);

    public void Execute()
    {
        bool acquired = false;
        try
        {
            acquired = m_Semaphore.WaitOne(0);
            if (!acquired)
            {
                throw new InvalidOperationException();
            }
            lock (m_Semaphore)
            {
                // Use the resource here.
            }
        }
        finally
        {
            if (acquired) m_Semaphore.Release();
        }
    }
}
There is one notable variation of this pattern. You could change the name of the method to TryExecute and have it return a bool instead of throwing an exception. It is completely up to you.
Remember that the object used in the lock expression is not the subject of the lock. It merely serves as an identifier for a synchronized block of code. Any code blocks that acquire locks using the same object will effectively be serialized. It is the code block that is being "locked", not the object used in the lock expression.
The lock statement is a shortcut for Monitor.Enter and Monitor.Exit. I do not think you have a way to get the number of waiting threads.
You can use a simple shared counter(integer) that increments before the lock statement. If the value is equal to 5 then have your thread avoid the lock statement. The challenge however is that you will need to lock the counter to ensure the increment operation is atomic.
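A sketch of that counter idea using Interlocked instead of a second lock, so the increment is atomic without extra locking (class and method names are hypothetical):

```csharp
using System;
using System.Threading;

// Rejects callers once the configured number of threads are already
// inside or waiting for the lock. Interlocked makes the counter
// updates atomic without a second lock.
class BoundedLock
{
    private readonly object _lock = new object();
    private int _threadsInOrWaiting;        // includes the thread holding the lock
    private const int Max = 5;

    public void Execute(Action action)
    {
        // Atomically count ourselves in; back out if the limit is hit.
        if (Interlocked.Increment(ref _threadsInOrWaiting) > Max)
        {
            Interlocked.Decrement(ref _threadsInOrWaiting);
            throw new InvalidOperationException("Too many threads waiting for the lock");
        }
        try
        {
            lock (_lock) { action(); }
        }
        finally
        {
            Interlocked.Decrement(ref _threadsInOrWaiting);
        }
    }
}
```

Note this counts the lock holder as well as the waiters, so the limit is approximate with respect to "waiting" threads; the Semaphore approach above has the same property.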
No, lock() uses the Monitor class, and that has no member for finding out the number of queued threads.
You can specify a time-out.
And frankly, throwing an Exception when a queue fills up sounds like a bad idea.
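For the time-out option mentioned above, Monitor.TryEnter returns false instead of blocking forever (a minimal sketch with hypothetical names):

```csharp
using System;
using System.Threading;

class TimeoutLockDemo
{
    static readonly object _sync = new object();

    // Returns false if the lock could not be acquired within the
    // timeout, instead of throwing or blocking indefinitely.
    public static bool TryDoWork(int timeoutMs)
    {
        if (!Monitor.TryEnter(_sync, timeoutMs))
            return false;
        try
        {
            // ... use the shared resource ...
            return true;
        }
        finally
        {
            Monitor.Exit(_sync);
        }
    }
}
```

Callers can then retry, back off, or report "busy" rather than queueing up behind the lock.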
