Question regarding locking, referring to the sample code below. I have a class (Class1) which exposes a public List property called Class1Resources. Two methods in Class1 provide basic query capability over Class1Resources. In addition, Class1 subscribes to an event from a different service which notifies it that it should update Class1Resources.
My question is: what and where should locking be implemented so that the two public methods which query Class1Resources are blocked while ExternalAppCallback is executing, thereby ensuring that the query methods always use the most current data? Is the commented code I have in ExternalAppCallback the proper way to do this?
public class Class1
{
public List<Resource> Class1Resources { get; private set; }
public Class1()
{
// subscribe to external app event, with callback = ExternalAppCallback
}
private void ExternalAppCallback(List<Resource> updatedResourceList)
{
// do I put the lock here as in the code below?
//lock(someObject)
//{
// Class1Resources = new List<Resource>(updatedResourceList);
//}
Class1Resources = new List<Resource>(updatedResourceList);
}
public List<Resource> GetResourcesByCriteria1(string criteria1)
{
return Class1Resources.Where(r => r.Criteria1 == criteria1).ToList();
}
public List<Resource> GetResourcesByCriteria2(string criteria2)
{
return Class1Resources.Where(r => r.Criteria2 == criteria2).ToList();
}
}
I'm interpreting your question as "how do I effectively make Class1Resources thread safe?", so I would recommend either a classic lock or, if you expect writes/changes to be infrequent, a ReaderWriterLockSlim. Here is how you'd use a lock in your class to ensure thread safety / consistent data:
public class Class1
{
// Here's your object to lock on
private readonly object _lockObject = new object();
// NOTE: made this private to control how it is exposed!
private List<Resource> Class1Resources = new List<Resource>(); // start empty so queries before the first callback don't throw
public Class1()
{
// subscribe to external app event, with callback = ExternalAppCallback
}
private void ExternalAppCallback(List<Resource> updatedResourceList)
{
// Reference assignment is atomic, but taking the same lock as the readers
// keeps the swap consistent (and visible) with respect to the query methods below
lock (_lockObject)
{
Class1Resources = new List<Resource>(updatedResourceList);
}
}
// Your new method to expose the list in a thread-safe manner
public List<Resource> GetResources()
{
lock (_lockObject)
{
// ToList() makes a copy of the list versus maintaining the original reference
return Class1Resources.ToList();
}
}
public List<Resource> GetResourcesByCriteria1(string criteria1)
{
lock (_lockObject)
{
return Class1Resources.Where(r => r.Criteria1 == criteria1).ToList();
}
}
public List<Resource> GetResourcesByCriteria2(string criteria2)
{
lock (_lockObject)
{
return Class1Resources.Where(r => r.Criteria2 == criteria2).ToList();
}
}
}
Note that in this solution, anything calling the getter of your property will not be using the lock and thus will cause thread safety issues. This is why I changed the code to make it a private member.
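If reads will vastly outnumber updates, a ReaderWriterLockSlim lets multiple queries run concurrently while still blocking them during an update. The sketch below is my own illustration of that alternative, not part of the answer above; it assumes the same Resource type and callback wiring as in the question.
// requires System.Threading, System.Linq, System.Collections.Generic
public class Class1
{
private readonly ReaderWriterLockSlim _lock = new ReaderWriterLockSlim();
private List<Resource> _resources = new List<Resource>();
private void ExternalAppCallback(List<Resource> updatedResourceList)
{
_lock.EnterWriteLock();
try
{
_resources = new List<Resource>(updatedResourceList);
}
finally
{
_lock.ExitWriteLock();
}
}
public List<Resource> GetResourcesByCriteria1(string criteria1)
{
_lock.EnterReadLock();
try
{
return _resources.Where(r => r.Criteria1 == criteria1).ToList();
}
finally
{
_lock.ExitReadLock();
}
}
}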
I have another question about locking in C#.
The problem is sharing state among different threads.
Here is the scenario:
A thread runs a state machine. This machine, for example, counts up a value with a delay: the machine thread reads the value and increments it. Other threads shall now be able to read this count value. Actually I have more values than just a single count, so I prepared a class which holds the shared values. It looks like this:
IStatus is a suitable private interface; the XML doc summaries are removed since they are not in English.
public class Status : ICloneable, IStatus
{
private object locker;
public bool Run { get; private set; }
public uint SecondRemain { get; private set; }
// ... and some more value types
public Status()
{
locker = new object();
}
void IStatus.SetRun(bool enable)
{
lock (locker)
{
Run = enable;
}
}
void IStatus.SetSecondRemain (uint value)
{
lock (locker)
{
SecondRemain = value;
}
}
// ... and some more set of value types
public object Clone()
{
object copy;
lock (locker)
{
copy = MemberwiseClone();
// Since it is a "new" object, we decouple the lock
((Status)copy).locker = new object();
}
return copy;
}
}
In the main class, where the state machine and its thread live:
private Status shared;
// Shall be accessible by any thread at any time
public Status GetStatus()
{
// Clones thread-safe, fenced by locker
return (Status) shared.Clone();
}
// Will be accessed only by inner thread
private void AnyMethodCalledStateMachineThread()
{
// Get the current remaining second value.
// No fence here ??
uint value = shared.SecondRemain;
// fences by locker
((IStatus) shared).SetSecondRemain (value + 1);
}
What we see now is that only the inner thread reads and writes some of the values; this thread reads back what it has previously written. To ensure other threads can read these values, the setters are locked.
External threads can only get a full copy, via the lock statement in Clone.
But does the internal thread itself need a lock too when reading single properties, or do I have to put a lock around these property getters (with an extra backing field)?
EDIT
This is the requested code which runs the thread:
private Thread thread;
private void Start()
{
thread = new Thread(new ThreadStart(ProducerMain));
thread.Start();
}
// Example, no real code
private void ProducerMain()
{
while ( ...)
{
Thread.Sleep(1000);
AnyMethodCalledStateMachineThread();
}
}
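For reference, the alternative the question mentions (putting the lock inside each property getter, with an explicit backing field) would look roughly like the following inside the Status class. This is my own sketch, not code from the question:
private uint secondRemain; // explicit backing field
public uint SecondRemain
{
get { lock (locker) { return secondRemain; } }
private set { lock (locker) { secondRemain = value; } }
}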
Is there a built-in ThreadLocal<T>-like construct for sharing an object within each unique thread, but recreating it if the original value was disposed/destroyed/torn down/nulled?
Here's my attempt at implementing such behaviour with ConcurrentDictionary (ThreadLocalDisposable2 below), but I was hoping to just use ThreadLocal<T> (as in ThreadLocalDisposable1). However, I can't get the Foo test to pass: .Values.Remove(this) doesn't do what I was hoping it would do, and I still get an ObjectDisposedException.
public class Class1
{
[Test]
public void Foo()
{
using (var foo = ThreadLocalDisposable1.Get())
foo.Foo();
using (var foo = ThreadLocalDisposable1.Get())
foo.Foo();
}
[Test]
public void Bar()
{
using (var bar = ThreadLocalDisposable2.Get())
bar.Foo();
using (var bar = ThreadLocalDisposable2.Get())
bar.Foo();
}
}
[1]
public class ThreadLocalDisposable1 : IDisposable
{
private Stream _foo;
private static ThreadLocal<ThreadLocalDisposable1> _thread;
static ThreadLocalDisposable1()
{
_thread = new ThreadLocal<ThreadLocalDisposable1>(() => new ThreadLocalDisposable1(), true);
}
private ThreadLocalDisposable1()
{
_foo = new MemoryStream();
}
public static ThreadLocalDisposable1 Get()
{
return _thread.Value;
}
public void Foo()
{
_foo.WriteByte(1);
}
public void Dispose()
{
//I do not think it means what I think it means
_thread.Values.Remove(this);
_foo.Dispose();
}
}
[2]
public class ThreadLocalDisposable2 : IDisposable
{
private Stream _foo;
private int _thread;
private static ConcurrentDictionary<int, ThreadLocalDisposable2> _threads;
static ThreadLocalDisposable2()
{
_threads = new ConcurrentDictionary<int, ThreadLocalDisposable2>();
}
private ThreadLocalDisposable2(int thread)
{
_thread = thread;
_foo = new MemoryStream();
}
public static ThreadLocalDisposable2 Get()
{
return _threads.GetOrAdd(Thread.CurrentThread.ManagedThreadId, i => new ThreadLocalDisposable2(i));
}
public void Foo()
{
_foo.WriteByte(1);
}
public void Dispose()
{
ThreadLocalDisposable2 thread;
_threads.TryRemove(_thread, out thread);
_foo.Dispose();
}
}
Edit:
Just to clarify what I mean: basically I want all of the behaviour of ThreadLocal, but when I call Dispose (on the value, i.e. the ThreadLocalDisposable* with the underlying Stream in this example, not on the static ThreadLocal itself), take that disposed instance out of circulation, i.e. if the thread asks for a value again, create a new one as if it were a brand new thread requiring a brand new instance.
ThreadLocalDisposable1, [1], is a sample class of what I think should have worked, except that the .Values.Remove(this) line doesn't "take it out of circulation" and force a new instance to be created for that thread.
ThreadLocalDisposable2, [2], with ConcurrentDictionary, is the alternative to ThreadLocal that I implemented, with the "take out of circulation" behaviour I'm after.
Edit:
This is not a real use case I have, just a general example I can think of: if you have, say, a static ThreadLocal<SqlConnection>, or a socket, and it's forcefully closed (and disposed in a finally block), drop that connection instance and create a new one transparently if it is called for again.
It seems like you're making this much harder than it has to be. Consider this:
public class MyClass: IDisposable
{
private Stream _foo;
public MyClass Get()
{
if (_foo == null)
{
_foo = new MemoryStream();
}
return this;
}
public void Foo()
{
_foo.WriteByte(1);
}
public void Dispose()
{
if (_foo != null)
{
_foo.Dispose();
_foo = null;
}
}
}
Now, you can create one of those:
ThreadLocal<MyClass> MyThing = new ThreadLocal<MyClass>(() => new MyClass());
And you can write:
using (MyThing.Value.Get())
{
// do stuff
}
That seems functionally equivalent to what you're trying to do with your ConcurrentDictionary stuff.
That said, it seems like this is something that would be better managed another way. I don't know your application so I can't say for sure, but it seems like a bad idea to have a stateful object like a Stream or SqlConnection as a global variable. Usually those things are job-specific rather than thread-specific, and as such should be passed as parameters when you start the job.
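As a rough sketch of that last suggestion (my own illustration, with a hypothetical DoWork method), the job would create, own, and dispose the resource itself instead of keeping it thread-local:
public static void RunJob()
{
// The job owns its stream for its whole lifetime; no thread-local state needed.
using (var stream = new MemoryStream())
{
DoWork(stream);
}
}
private static void DoWork(Stream stream)
{
stream.WriteByte(1);
}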
Question: let's say I had Thread A and Thread B, and both of these needed access to a singleton object and its properties.
Currently the singleton looks as follows.
public class Singleton{
#region fields
private static Singleton singletonObject;
private double value1= 0;
private double value2= 0;
private double value3= 0;
private double value4= 0;
private object locker = null;
#endregion
// private constructor. This will avoid creating object using new keyword
private Singleton() {
locker = new object();
}
// public method which will be called
public void GetName() {
Console.WriteLine("singleton Object");
}
public static Singleton Instance() {
// this object will be used with lock, so that it will be always one thread which will be executing the code
object instanceLocker = new object();
// put a lock on instanceLocker. We won't be able to use singletonObject because it will be null; the lock is to make the object thread safe.
// lock can't be used with null objects.
lock (instanceLocker) {
// check whether the instance was there. If it's not there, then create an instance.
if (singletonObject == null) {
singletonObject = new Singleton();
}
}
return singletonObject;
}
public double Value1 { get { lock (locker) { return value1; } } set { lock (locker) { value1= value; } } }
public double Value2 { get { lock (locker) { return value2; } } set { lock (locker) { value2= value; } } }
public double Value3 { get { lock (locker) { return value3; } } set { lock (locker) { value3= value; } } }
public double Value4 { get { lock (locker) { return value4; } } set { lock (locker) { value4= value; } } }
}
My question: rather than having thread-safe properties, is there a better approach?
Thanks.
Currently your code is completely broken. You're creating a new object to lock on during every call. No other thread will ever know about it, so it's completely pointless.
Don't bother trying to fix it in clever ways. Just initialize it in the static variable initializer:
private static Singleton singletonObject = new Singleton();
Nice and simple.
For more information about implementing the singleton pattern in C# (including using Lazy<T> in .NET 4), see my article on the topic.
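For reference, a Lazy<T>-based version (a minimal sketch, not taken from the article) could look like this:
public sealed class Singleton
{
private static readonly Lazy<Singleton> instance =
new Lazy<Singleton>(() => new Singleton());
private Singleton() { }
public static Singleton Instance
{
get { return instance.Value; }
}
}
By default Lazy<T> uses LazyThreadSafetyMode.ExecutionAndPublication, so the factory runs at most once even under concurrent access.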
Aside from the fact that you're creating a new object to lock on for every call, there is another fundamental problem: even if you do have the same object, you're still not really protecting anything.
Somewhere along the line you initialize Value1 to 9:
Singleton.Instance().Value1 = 9;
Now let's say you have two threads executing this code:
public void Foo()
{
Singleton.Instance().Value1++;
if(Singleton.Instance().Value1==10.0)
{
Singleton.Instance().Value2 = 20.0;
}
else
{
Singleton.Instance().Value3 = 30.0;
}
}
Thread A calls Value1++, incrementing value1 to 10.0.
Thread B calls Value1++, and now value1 is 11.0.
Thread A checks whether value1 is 10.0 -> returns false!
Thread A sets Value3 to 30.
Thread B sets Value3 to 30 as well.
This is just a very simple example where locking the properties will not protect you since the external code does nothing to guarantee the order in which things are being read or written. There could be a number of other orders in which Thread A and Thread B are executed which will result in completely different outcomes.
This behavior may be OK, since you could have let the user of the Singleton class take the responsibility for ensuring the correct operation outside your class, but it's generally something you should be aware of. Simply locking the properties will not eliminate the read/write contention.
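One way to remove that particular race (a sketch of the idea, not code from the question) is to expose the whole read-modify-write as a single operation on the singleton, so the check and the updates happen under one lock. The method name here is hypothetical:
// Hypothetical method added to the Singleton class from the question
public void IncrementAndBranch()
{
lock (locker)
{
value1++;
if (value1 == 10.0)
{
value2 = 20.0;
}
else
{
value3 = 30.0;
}
}
}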
Are you using .NET 4.0? Instead of locking, you can use the concurrent collections (System.Collections.Concurrent) for thread-safe access.
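For example, if the shared values were keyed by name, a ConcurrentDictionary could replace the hand-rolled locking. This is my own sketch, not code from the answer; note that it only makes individual operations atomic, so compound read-then-write sequences still need a single atomic call such as AddOrUpdate:
// requires System.Collections.Concurrent
private static readonly ConcurrentDictionary<string, double> values =
new ConcurrentDictionary<string, double>();
public static void Example()
{
values["Value1"] = 9.0; // thread-safe write
values.AddOrUpdate("Value1", 1.0, (key, old) => old + 1); // atomic increment
double current = values.GetOrAdd("Value2", 20.0); // atomic read-or-insert
}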
I'm posting my understanding of the C# lock statement below; please help me validate whether I've got it right.
public class TestLock
{
private object threadLock = new object();
...
public void PrintOne()
{
lock (threadLock)
{
// SectionOne
}
}
public void PrintTwo()
{
lock (threadLock)
{
// SectionTwo
}
}
...
}
Case I> Thread1 and Thread2 simultaneously try to call PrintOne.
Since PrintOne is guarded by the instance lock, at any time, only
one thread can exclusively enter the SectionOne.
Is this correct?
Case II> Thread1 and Thread2 simultaneously try to call PrintOne and PrintTwo
respectively (i.e. Thread1 calls PrintOne and Thread2 calls PrintTwo)
Since two print methods are guarded by the same instance lock, at any time,
only one thread can exclusively access either SectionOne or SectionTwo, but NOT both.
Is this correct?
Cases I and II are true only if all your threads use the same instance of the class. If they use different instances, then both cases are false.
Sample
public class TestLock
{
private object threadLock = new object();
public void PrintOne()
{
lock (threadLock)
{
Console.WriteLine("One");
var f = File.OpenWrite(@"C:\temp\file.txt"); //same static resource
f.Close();
}
}
public void PrintTwo()
{
lock (threadLock)
{
Console.WriteLine("Two");
var f = File.OpenWrite(@"C:\temp\file.txt"); //same static resource
f.Close();
}
}
}
And testing code
static void Main(string[] args)
{
int caseNumber = 100;
var threads = new Thread[caseNumber];
for (int i = 0; i < caseNumber; i++)
{
var t = new Thread(() =>
{
//create new instance
var testLock = new TestLock();
//for this instance we safe
testLock.PrintOne();
testLock.PrintTwo();
});
t.Start();
//once created more than one thread, we are unsafe
}
}
One of the possible solutions is to add a static keyword to the locking object declaration and methods that use it.
private static object threadLock = new object();
UPDATE
Good point made by konrad.kruczynski
..."thread safety" is also assumed from
context. For example, I could take
your file opening code and also
generate exception with static lock -
just taking another application
domain. And therefore propose that OP
should use system-wide Mutex class or
sth like that. Therefore static case
is just inferred as the instance one.
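To illustrate that remark, a cross-process variant of the file example would need a named Mutex rather than a CLR lock. This is my own sketch (the mutex name is made up), not part of the original answer:
public void PrintOne()
{
// A named mutex is visible to every process on the machine.
using (var mutex = new Mutex(false, @"Global\TempFileLock"))
{
mutex.WaitOne();
try
{
Console.WriteLine("One");
var f = File.OpenWrite(@"C:\temp\file.txt");
f.Close();
}
finally
{
mutex.ReleaseMutex();
}
}
}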
Case I: Check ✓
Case II: Check ✓
Don't forget that locking is only one way of thread synchronization. For other useful methods, read: Thread Synchronization
Straight from MSDN sample:
public class TestThreading
{
private System.Object lockThis = new System.Object();
public void Process()
{
lock (lockThis)
{
// Access thread-sensitive resources.
}
}
}
Yes and yes. Cases are correct.
Your understanding is 100% correct. So if, for instance, you wanted to allow entry into the two methods separately, you would want to have two locks.
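A two-lock variant that lets PrintOne and PrintTwo run concurrently (a sketch of that idea, assuming the two sections touch unrelated data) could look like this:
public class TestLock
{
private readonly object lockOne = new object();
private readonly object lockTwo = new object();
public void PrintOne()
{
lock (lockOne)
{
// SectionOne
}
}
public void PrintTwo()
{
lock (lockTwo)
{
// SectionTwo
}
}
}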
Yes, you're correct in both counts.
Here are the basics (more or less):
1) use instance locks for instance data
public class InstanceOnlyClass{
private int callCount;
private object lockObject = new object();
public void CallMe()
{
lock(lockObject)
{
callCount++;
}
}
}
2) use static locks for static data
public class StaticOnlyClass{
private static int createdObjects;
private static object staticLockObject = new object();
public StaticOnlyClass()
{
lock(staticLockObject)
{
createdObjects++;
}
}
}
3) if you are protecting static and instance data use separate static and instance locks
public class StaticAndInstanceClass{
private static int createdObjects;
private static object staticLockObject = new object();
private int callCount;
private object lockObject = new object();
public StaticAndInstanceClass()
{
lock(staticLockObject)
{
createdObjects++;
}
}
public void CallMe()
{
lock(lockObject)
{
callCount++;
}
}
}
Based on this, your code is fine if you are accessing instance data but unsafe if you are modifying static data.
I've been using this pattern to initialize static data in my classes. It looks thread safe to me, but I know how subtle threading problems can be. Here's the code:
public class MyClass // bad code, do not use
{
static string _myResource = "";
static volatile bool _init = false;
public MyClass()
{
if (_init == true) return;
lock (_myResource)
{
if (_init == true) return;
Thread.Sleep(3000); // some operation that takes a long time
_myResource = "Hello World";
_init = true;
}
}
public string MyResource { get { return _myResource; } }
}
Are there any holes here? Maybe there is a simpler way to do this.
UPDATE: Consensus seems to be that a static constructor is the way to go. I came up with the following version using a static constructor.
public class MyClass
{
static MyClass() // a static constructor
{
Thread.Sleep(3000); // some operation that takes a long time
_myResource = "Hello World";
}
static string _myResource = null;
public MyClass() { LocalString = "Act locally"; } // an instance constructor
// use but don't modify
public bool MyResourceReady { get { return _myResource != null; } }
public string LocalString { get; set; }
}
I hope this is better.
You can use static constructors to initialize your static variables, which C# guarantees will only be called once within each AppDomain. Not sure if you considered them.
So you can read this: http://msdn.microsoft.com/en-us/library/aa645612(VS.71).aspx (Static Constructors)
And this: Is the C# static constructor thread safe?
Performing a lock() on _myResource and changing it inside the lock() statement seems like a bad idea.
Consider the following workflow:
Thread 1 calls MyClass().
Execution stops before the line _init = true;, right after assigning _myResource.
The processor switches to thread 2.
Thread 2 calls MyClass(). Since _init is still false and the _myResource reference has changed, it successfully enters the lock() statement block.
_init is still false, so thread 2 reassigns _myResource.
Workaround: create a static object and lock on this object instead of initialized resource:
private static readonly object _resourceLock = new object();
/*...*/
lock(_resourceLock)
{
/*...*/
}
Your class is not safe:
You change the object you're locking on after you've locked on it.
You have a property that gets the resource without locking it.
You lock on a string instance, which is generally not a good practice.
This should do it for you:
public class MyClass
{
static readonly object _sync = new object();
static string _myResource = "";
static volatile bool _init = false;
public MyClass()
{
if (_init == true) return;
lock (_sync)
{
if (_init == true) return;
Thread.Sleep(3000); // some operation that takes a long time
_myResource = "Hello World";
_init = true;
}
}
public string MyResource
{
get
{
string ret; // copy under the lock rather than returning the field directly
lock(_sync)
{
ret = _myResource;
}
return ret;
}
}
}
Update:
Correct, the static resource should not be returned directly... I've corrected my example accordingly.
Depending on your use case (i.e. if threads don't need to pass information to each other using this variable), marking the member variable as [ThreadStatic] may be a solution.
See here.
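A sketch of what that might look like (my illustration; note that an inline initializer on a [ThreadStatic] field runs only for the first thread, so the value is usually initialized lazily per thread):
public class MyClass
{
[ThreadStatic]
private static string _myResource;
public string MyResource
{
get
{
if (_myResource == null)
{
_myResource = "Hello World"; // each thread initializes its own copy
}
return _myResource;
}
}
}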
static string _myResource = "";
...
public MyClass()
{
...
lock (_myResource)
{
}
}
Due to string interning, you should not lock on a string literal. If you lock on a string literal and that string literal is used by multiple classes then you may be sharing that lock. This can potentially cause unexpected behavior.
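A quick illustration of the interning point (my own example): two identical literals in completely unrelated pieces of code typically refer to the same interned string instance, so locks taken on them are in fact the same lock.
string a = "config"; // in one class
string b = "config"; // in a different class, same literal
Console.WriteLine(object.ReferenceEquals(a, b)); // True - same interned instance
lock (a) { /* this also blocks any code that locks on b */ }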