Why do my threads become deadlocked in C#?

I'm having problems with multi-threading in an application I'm working on at the minute.
The process basically involves a list of items which need to be processed. As part of this processing, a call needs to be made to a third-party API which does not support multi-threading.
I've attempted to introduce a singleton instance of the API class and use locking to ensure that only one thread makes a call to it at once, but I still get a situation where one thread gets stuck on the call to the API and the others are then stuck waiting on the lock to be released.
If I pause the debug session and check the callstack for the threads the one that has made it to the API call has the following trace:
mscorlib.dll!System.Threading.WaitHandle.WaitAll(System.Threading.WaitHandle[] waitHandles, int millisecondsTimeout, bool exitContext)
mscorlib.dll!System.Threading.WaitHandle.WaitAll(System.Threading.WaitHandle[] waitHandles)
I've tested this on a single thread by swapping out the thread pool in the foreach loop with an explicit call to the Process method and it works fine (although slower than I would like, there is quite a lot of processing before and after the API call).
Am I doing something wrong here or is this an issue with the third party api?
public class MyClass
{
private static ThirdPartyApi ApiInstance;
private static object lockObject = new object();
...
public void DoWork(List<string> list)
{
...
foreach (var item in list)
{
ThreadPool.QueueUserWorkItem(state => Process((string)state), item);
}
...
}
public void Process(string item)
{
// Various processing
...
lock(lockObject)
{
var result = ApiInstance.Lookup(item);
}
...
}
}

Code being thread-unsafe doesn't necessarily just mean that the methods are not re-entrant; some thread-unsafe libraries require all calls to come from the same thread, period. Try the following approach using a BlockingCollection instead, which issues all calls on the same thread, and see if it resolves the issue.
public class MyClass<T>
{
private BlockingCollection<T> workQueue = new BlockingCollection<T>();
public MyClass()
{
Task.Factory.StartNew(ProcessWorkQueue, TaskCreationOptions.LongRunning);
}
public void DoWork(List<T> work)
{
foreach (var workItem in work)
{
workQueue.Add(workItem);
}
}
public void StopWork()
{
workQueue.CompleteAdding();
}
public void ProcessWorkQueue()
{
foreach(var item in workQueue.GetConsumingEnumerable())
{
//Do something here
}
}
}
Also, the ThreadPool is a shared resource, and performing any blocking action on a thread-pool thread can exhaust it. Even if your code did work, it would need to be refactored to address this resource-starvation issue.
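For completeness, a rough sketch of how the queue class above might be wired up for the question's scenario; the ThirdPartyApi.Lookup call from the question would go inside ProcessWorkQueue, and the rest of this wiring (including itemsToProcess) is an assumption:

// Hypothetical usage: every Lookup call happens on the single long-running
// consumer task, so the third-party API is never entered by two threads.
var processor = new MyClass<string>();
processor.DoWork(itemsToProcess);   // enqueue everything; returns immediately
processor.StopWork();               // signal "no more items"; the consumer drains what is queued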

Is there a version of SemaphoreSlim, or another method, that will let the same thread in downstream?

I am refactoring older synchronous C# code to use an async library. The current synchronous code makes liberal usage of locks. Outer methods often call inner methods, where both lock on the same objects. These are often "protected objects" defined in the base class and locked upon in base virtual methods and the overrides that call the base. For synchronous code, that's ok as the thread entering the outer/override method lock can also enter the inner/base method one. That is not the case for async / SemaphoreSlim(1,1)s.
I'm looking for a robust locking mechanism I can use in the async world that will allow subsequent downstream calls to the same locking object to enter the lock, as per the behaviour of the synchronous "lock {...}" syntax. The closest I have come is SemaphoreSlim, but it is too restrictive for my needs. It restricts access not only to other threads, but also to the same thread requesting entrance in the inner call. Alternatively, is there a way to know that the thread is already "inside" the semaphore before calling the inner SemaphoreSlim.WaitAsync()?
Answers questioning the design structure of the inner/outer methods both locking on the same object are welcome (I question it myself!), but if so please propose alternative options. I have thought of only using private SemaphoreSlim(1,1)s, and having inheritors of the base class use their own private semaphores. But it gets tricky to manage quite quickly.
Sync Example: Because the same thread is requesting entrance to the lock in both inner and outer, it lets it in and the method can complete.
private object LockObject = new object();
public void Outer()
{
lock (LockObject)
{
foreach (var item in collection)
{
Inner(item);
}
}
}
public void Inner(string item)
{
lock (LockObject)
{
DoWork(item);
}
}
Async Example: The semaphore doesn't work like that; it gets stuck at the first iteration of InnerAsync because a semaphore is just a signal: it doesn't let another caller pass until it is released, even if the same thread is the one requesting entry.
protected SemaphoreSlim LockObjectAsync = new SemaphoreSlim(1,1);
public async Task OuterAsync()
{
try
{
await LockObjectAsync.WaitAsync();
foreach (var item in collection)
{
await InnerAsync(item);
}
}
finally
{
LockObjectAsync.Release();
}
}
public async Task InnerAsync(string item)
{
try
{
await LockObjectAsync.WaitAsync();
DoWork(item);
}
finally
{
LockObjectAsync.Release();
}
}
I am in full agreement with Servy here:
Reentrancy like this should generally be avoided even in synchronous code (it usually makes it easier to make mistakes).
Here's a blog post on the subject I wrote a while ago. Kinda long-winded; sorry.
I'm looking for a robust locking mechanism I can use in the async world that will allow subsequent downstream calls to the same locking object, to enter the lock, as per the behaviour in synchronous "lock {...}" syntax.
TL;DR: There isn't one.
Longer answer: An implementation exists, but I wouldn't use the word "robust".
My recommended solution is to refactor first so that the code no longer depends on lock re-entrancy. Make the existing code use SemaphoreSlim (with synchronous Waits) instead of lock.
This refactoring isn't extremely straightforward, but a pattern I like to use is to refactor the "inner" methods into private (or protected if necessary) implementation methods that are always executed under lock. I strongly recommend these inner methods follow a naming convention; I tend to use the ugly-but-in-your-face _UnderLock. Using your example code this would look like:
private object LockObject = new();
public void Outer()
{
lock (LockObject)
{
foreach (var item in collection)
{
Inner_UnderLock(item);
}
}
}
public void Inner(string item)
{
lock (LockObject)
{
Inner_UnderLock(item);
}
}
private void Inner_UnderLock(string item)
{
DoWork(item);
}
This gets more complex if there are multiple locks, but for simple cases this refactoring works well. Then you can replace the reentrant locks with non-reentrant SemaphoreSlims:
private SemaphoreSlim LockObject = new(1);
public void Outer()
{
LockObject.Wait();
try
{
foreach (var item in collection)
{
Inner_UnderLock(item);
}
}
finally
{
LockObject.Release();
}
}
public void Inner(string item)
{
LockObject.Wait();
try
{
Inner_UnderLock(item);
}
finally
{
LockObject.Release();
}
}
private void Inner_UnderLock(string item)
{
DoWork(item);
}
If you have many of these methods, look into writing a little extension method for SemaphoreSlim that returns IDisposable, and then you end up with using blocks that look more similar to the old lock blocks instead of having try/finally everywhere.
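For example, a minimal sketch of such an extension, under the assumption that you roll it yourself rather than take it from a library:

public static class SemaphoreSlimExtensions
{
    // Usage: using (LockObject.Lock()) { ... } or using (await LockObject.LockAsync()) { ... }
    public static IDisposable Lock(this SemaphoreSlim semaphore)
    {
        semaphore.Wait();
        return new Releaser(semaphore);
    }

    public static async Task<IDisposable> LockAsync(this SemaphoreSlim semaphore)
    {
        await semaphore.WaitAsync().ConfigureAwait(false);
        return new Releaser(semaphore);
    }

    private sealed class Releaser : IDisposable
    {
        private readonly SemaphoreSlim _semaphore;
        public Releaser(SemaphoreSlim semaphore) { _semaphore = semaphore; }
        public void Dispose() { _semaphore.Release(); }
    }
}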
The not-recommended solution:
As canton7 suspected, an asynchronous recursive lock is possible, and I have written one. However, that code has never been published nor supported, nor will it ever be. It hasn't been proven in production or even fully tested. But it does, technically, exist.

How check "IsAlive" status of c# async Thread? [duplicate]

This question already has answers here:
How to wait for async method to complete?
(7 answers)
Closed 2 years ago.
Let's say I have a MyThread class in my Windows C# app like this:
public class MyThread
{
Thread TheThread;
public MyThread()
{
TheThread = new Thread(MyFunc);
}
public void StartIfNecessary()
{
if (!TheThread.IsAlive)
TheThread.Start();
}
private void MyFunc()
{
for (;;)
{
if (ThereIsStuffToDo)
DoSomeStuff();
}
}
}
That works fine. But now I realize I can make my thread more efficient by using async/await:
public class MyThread
{
Thread TheThread;
public MyThread()
{
TheThread = new Thread(MyFunc);
}
public void StartIfNecessary()
{
if (!TheThread.IsAlive)
TheThread.Start();
}
private async void MyFunc()
{
for (;;)
{
DoSomeStuff();
await MoreStuffIsReady();
}
}
}
What I see now is that the second time I call StartIfNecessary(), TheThread.IsAlive is false (and ThreadState is Stopped, BTW), so it calls TheThread.Start(), which then throws the ThreadStateException "Thread is running or terminated; it cannot restart". But I can see that DoSomeStuff() is still getting called, so the function is in fact still executing.
I suspect what is happening, is that when my thread hits the "await", the thread I created is stopped, and when the await on MoreStuffIsReady() completes, a thread from the thread pool is assigned to execute DoSomeStuff(). So it is technically true that the thread I created has been stopped, but the function I created that thread to process is still running.
So how can I tell if "MyFunc" is still active?
I can think of 3 ways to solve this:
1) Add a "bool IsRunning" which is set to true right before calling TheThread.Start(), and MyFunc() sets to false when it completes. This is simple, but requires me to wrap everything in a try/catch/finally which isn't awful but I was hoping there was a way to have the operating system or framework help me out here just in case "MyFunc" dies in some way I wasn't expecting.
2) Find some new function somewhere in System.Threading that will give me the information I need.
3) Rethink the whole thing - since my thread only sticks around for a few milliseconds, is there a way to accomplish this same functionality without creating a thread at all (outside of the thread pool)? Start "MyFunc" as a Task somehow?
Best practices in this case?
Sticking with a Plain Old Thread and using BlockingCollection to avoid a tight loop:
class MyThread
{
private Thread worker;
private BlockingCollection<Action> stuff = new BlockingCollection<Action>();
public MyThread()
{
worker = new Thread(MyFunc); // created here: a field initializer cannot reference the instance method MyFunc
worker.Start();
}
void MyFunc()
{
foreach (var todo in stuff.GetConsumingEnumerable())
{
try
{
todo();
}
catch(Exception ex)
{
// Something went wrong in todo()
}
}
stuff.Dispose(); // should be disposed!
}
public void Shutdown()
{
stuff.CompleteAdding(); // No more adding, but will continue to serve until empty.
}
public void Add( Action stuffTodo )
{
stuff.Add(stuffTodo); // Will throw after Shutdown is called
}
}
The BlockingCollection documentation also shows examples with Task, if you prefer to go down that road.
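For instance, a hedged sketch of hosting the same consumer loop on a long-running Task instead of a dedicated Thread (the member names here are assumptions; the stuff collection is the one from the class above):

private Task consumer;

public void StartOnTask()
{
    // GetConsumingEnumerable blocks while waiting for work, so LongRunning
    // asks the scheduler for a dedicated thread rather than a pool worker.
    consumer = Task.Factory.StartNew(MyFunc, TaskCreationOptions.LongRunning);
}

public void ShutdownAndWait()
{
    stuff.CompleteAdding();
    consumer.Wait(); // also observes any exception thrown by the loop
}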
Rethink the whole thing
This is definitely the best option. Get rid of the thread completely.
It seems like you have a "consumer" kind of scenario, and you need a consumer with a buffer of data items to work on.
One option is to use ActionBlock<T> from TPL Dataflow:
public class NeedsADifferentName
{
ActionBlock<MyDataType> _block;
public NeedsADifferentName() => _block = new ActionBlock<MyDataType>(MyFunc);
public void QueueData(MyDataType data) => _block.Post(data);
private void MyFunc(MyDataType data)
{
DoSomeStuff(data);
}
}
Alternatively, you can build your own pipeline using something like Channels.
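A rough sketch of the Channels variant, assuming System.Threading.Channels is available (the member names are mine; MyDataType and DoSomeStuff are carried over from the example above):

public class ChannelBasedConsumer
{
    private readonly Channel<MyDataType> _channel = Channel.CreateUnbounded<MyDataType>();

    public ChannelBasedConsumer()
    {
        // The single reader loop plays the role of the old dedicated thread.
        _ = Task.Run(ProcessAsync);
    }

    public void QueueData(MyDataType data) => _channel.Writer.TryWrite(data);

    public void Complete() => _channel.Writer.Complete();

    private async Task ProcessAsync()
    {
        await foreach (var data in _channel.Reader.ReadAllAsync())
        {
            DoSomeStuff(data);
        }
    }
}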

Synchronize two methods and avoid running them simultaneously

class1 has two methods, do1 and do2.
It is instantiated from multiple threads in the same application.
I need to synchronize the two methods with these specs:
do1 can be executed by only one thread at a time; a lock would be a good solution.
do2 can be called from multiple threads at the same time, but
it cannot be called while do1 is running.
When do1 is called, every thread must wait for do1 to complete before starting do2.
Thanks.
There are basically two ways to accomplish this. One is with a Semaphore, where one method takes all available slots and the other takes one slot per execution attempt. That's a bit of a hack, though; what you really need is a synchronization object that allows both exclusive and non-exclusive locks.
That's what ReaderWriterLock and ReaderWriterLockSlim do. They're designed for cases where you need exclusive write access to a resource but non-exclusive read access, and they work well for this sort of scenario:
ReaderWriterLockSlim m_lock = new ReaderWriterLockSlim();
public void do2()
{
m_lock.EnterReadLock();
try
{
// Do work, many threads can enter this lock at the same time
}
finally
{
m_lock.ExitReadLock();
}
}
public void do1()
{
m_lock.EnterWriteLock();
try
{
// Do work, only one thread can be in here at once
}
finally
{
m_lock.ExitWriteLock();
}
}
If I understood the question correctly, the problem can be solved by simply using a private static lock object to synchronize the methods, which can be done as follows.
public class c1
{
private static object iLock = new object();
public void do1()
{
lock (iLock)
{
// actual method body
}
}
public void do2()
{
lock (iLock)
{
// actual method body
}
}
}

How to create an asynchronous method

I have a simple method in my C# app: it picks a file from an FTP server, parses it, and stores the data in a DB. I want it to be asynchronous, so that the user can perform other operations in the app; once parsing is done, they should get a message stating "Parsing is done".
I know this can be achieved through an asynchronous method call, but I don't know how to do that. Can anybody help me, please?
You need to use delegates and the BeginInvoke method that they contain to run another method asynchronously. At the end of the method being run by the delegate, you can notify the user. For example:
class MyClass
{
private delegate void SomeFunctionDelegate(int param1, bool param2);
private SomeFunctionDelegate sfd;
public MyClass()
{
sfd = new SomeFunctionDelegate(this.SomeFunction);
}
private void SomeFunction(int param1, bool param2)
{
// Do stuff
// Notify user
}
public void GetData()
{
// Do stuff
sfd.BeginInvoke(34, true, null, null);
}
}
Read up at http://msdn.microsoft.com/en-us/library/2e08f6yc.aspx
Try this method:
public static void RunAsynchronously(Action method, Action callback) {
ThreadPool.QueueUserWorkItem(_ =>
{
try {
method();
}
catch (ThreadAbortException) { /* don't report on this */ }
catch (Exception ex) {
// swallowed; log ex here if you need visibility into failures
}
// note: this will not be called if the thread is aborted
if (callback != null) callback();
});
}
Usage:
RunAsynchronously( () => { /* pick the file from the FTP server and parse it */ },
() => { Console.WriteLine("Parsing is done"); } );
Any time you're doing something asynchronous, you're using a separate thread, either a new thread, or one taken from the thread pool. This means that anything you do asynchronously has to be very careful about interactions with other threads.
One way to do that is to place the code for the async thread (call it thread "A") along with all of its data into another class (call it class "A"). Make sure that thread "A" only accesses data in class "A". If thread "A" only touches class "A", and no other thread touches class "A"'s data, then there's one less problem:
public class MainClass
{
private sealed class AsyncClass
{
private int _counter;
private readonly int _maxCount;
public AsyncClass(int maxCount) { _maxCount = maxCount; }
public void Run()
{
while (_counter++ < _maxCount) { Thread.Sleep(1); }
CompletionTime = DateTime.Now;
}
public DateTime CompletionTime { get; private set; }
}
private AsyncClass _asyncInstance;
public void StartAsync()
{
var asyncDoneTime = DateTime.MinValue;
_asyncInstance = new AsyncClass(10);
Action asyncAction = _asyncInstance.Run;
asyncAction.BeginInvoke(
ar =>
{
asyncAction.EndInvoke(ar);
asyncDoneTime = _asyncInstance.CompletionTime;
// Report completion inside the callback; outside it, this line would run
// before the async work finished and print DateTime.MinValue.
Console.WriteLine("Async task ended at {0}", asyncDoneTime);
}, null);
}
}
Notice that the only part of AsyncClass that's touched from the outside is its public interface, and the only part of that which is data is CompletionTime. Note that this is only touched after the asynchronous task is complete. This means that nothing else can interfere with the task's inner workings, and it can't interfere with anything else.
Here are two links about threading in C#
Threading in C#
Multi-threading in .NET: Introduction and suggestions
I'd start to read about the BackgroundWorker class
In ASP.NET I use a lot of static methods for jobs to be done. If it's simply a job where I need no response or status, I do something simple like below. As you can see, I can choose to call either ResizeImages or ResizeImagesAsync depending on whether I want to wait for it to finish or not.
Code explanation: I use http://imageresizing.net/ to resize/crop images, and the method SaveBlobPng stores the images in Azure (cloud), but since that is irrelevant for this demo I didn't include that code. It's a good example of a time-consuming task, though.
private delegate void ResizeImagesDelegate(string tempuri, Dictionary<string, string> versions);
private static void ResizeImagesAsync(string tempuri, Dictionary<string, string> versions)
{
ResizeImagesDelegate worker = new ResizeImagesDelegate(ResizeImages);
worker.BeginInvoke(tempuri, versions, null, null);
}
private static void ResizeImages(string tempuri, Dictionary<string, string> versions)
{
//the job, whatever it might be
foreach (var item in versions)
{
var image = ImageBuilder.Current.Build(tempuri, new ResizeSettings(item.Value));
SaveBlobPng(image, item.Key);
image.Dispose();
}
}
Or going for threading so you dont have to bother with Delegates
private static void ResizeImagesAsync(string tempuri, Dictionary<string, string> versions)
{
Thread t = new Thread (() => ResizeImages(tempuri, versions));
t.Start();
}
ThreadPool.QueueUserWorkItem is the quickest way to get a process running on a different thread.
Be aware that UI objects have "thread affinity" and cannot be accessed from any thread other than the one that created them.
So, in addition to checking out the ThreadPool (or using the asynchronous programming model via delegates), you need to check out Dispatchers (wpf) or InvokeRequired (winforms).
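As a small illustration, a WinForms-flavoured sketch of marshalling a result back to the UI thread (the control and worker method names here are hypothetical):

// Work runs on a pool thread; the UI update is marshalled back with Invoke.
ThreadPool.QueueUserWorkItem(_ =>
{
    string message = DownloadAndParse(); // hypothetical long-running work

    if (statusLabel.InvokeRequired)
        statusLabel.Invoke(new Action(() => statusLabel.Text = message));
    else
        statusLabel.Text = message;
});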
In the end you will have to use some sort of threading. The way it basically works is that you start a function with a new thread and it will run until the end of the function.
If you are using Windows Forms, then a nice wrapper for this is called the BackgroundWorker. It allows you to work in the background without locking up the UI form, and it even provides a way to communicate with the forms and raise progress-update events.
Background Worker
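A minimal sketch of the BackgroundWorker pattern applied to the question's scenario (the DownloadAndParseFile helper is an assumption, not part of the question's code):

var worker = new BackgroundWorker();

worker.DoWork += (s, e) =>
{
    // Runs on a pool thread: fetch the file from FTP and parse it here.
    e.Result = DownloadAndParseFile(); // hypothetical helper
};

worker.RunWorkerCompleted += (s, e) =>
{
    // Runs back on the UI thread, so it is safe to touch controls here.
    MessageBox.Show("Parsing is done");
};

worker.RunWorkerAsync();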
.NET has the async keyword for asynchronous functions. You can start digging at learn.microsoft.com (async). The shortest general way to make a function asynchronous is to change a function F:
Object F(Object args)
{
...
return RESULT;
}
to something like this:
async Task<Object> FAsync(Object args)
{
...
await RESULT_FROM_PROMISE;
...
return RESULT;
}
The most important thing in the code above is that when execution reaches the await keyword, control returns to the function that called FAsync, which can do other work until the promised value is available; execution then proceeds with the rest of the code in FAsync.
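Applied to the question's FTP-and-parse scenario, a hedged sketch might look like this (DownloadAndParse and SaveToDatabaseAsync are hypothetical helpers, not part of the question's code):

public async Task ImportFromFtpAsync(string ftpUri)
{
    // Push the blocking download/parse work to the thread pool and await it,
    // so the calling (UI) thread stays responsive.
    var records = await Task.Run(() => DownloadAndParse(ftpUri));

    await SaveToDatabaseAsync(records); // assumed to be an async DB call

    Console.WriteLine("Parsing is done"); // notify the user
}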

Does C# have a "ThreadLocal" analog (for data members) to the "ThreadStatic" attribute?

I've found the "ThreadStatic" attribute to be extremely useful recently, but it now makes me want a "ThreadLocal" type attribute that lets me have non-static data members on a per-thread basis.
Now I'm aware that this would have some non-trivial implications, but:
Does such a thing already exist built into C#/.NET? Or, since it appears so far that the answer is no (for .NET < 4.0), is there a commonly used implementation out there?
I can think of a reasonable way to implement it myself, but would just use something that already existed if it were available.
Straw Man example that would implement what I'm looking for if it doesn't already exist:
class Foo
{
[ThreadStatic]
static Dictionary<Object,int> threadLocalValues = new Dictionary<Object,int>();
int defaultValue = 0;
int ThreadLocalMember
{
get
{
int value = defaultValue;
if( ! threadLocalValues.TryGetValue(this, out value) )
{
threadLocalValues[this] = value;
}
return value;
}
set { threadLocalValues[this] = value; }
}
}
Please forgive any C# ignorance. I'm a C++ developer that has only recently been getting into the more interesting features of C# and .net
I'm limited to .net 3.0 and maybe 3.5 (project has/will soon move to 3.5).
My specific use-case is callback lists that are thread-specific (using the imaginary [ThreadLocal] attribute), à la:
class NonSingletonSharedThing
{
[ThreadLocal] List<Callback> callbacks;
public void ThreadLocalRegisterCallback( Callback somecallback )
{
callbacks.Add(somecallback);
}
public void ThreadLocalDoCallbacks()
{
foreach( var callback in callbacks )
callback.invoke();
}
}
Enter .NET 4.0!
If you're stuck in 3.5 (or earlier), there are some functions you should look at, like Thread.AllocateDataSlot, which should do what you want.
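For reference, a small sketch of the .NET 4.0 System.Threading.ThreadLocal<T> type applied to the question's callback-list example (the Callback type is the one from the question; the rest is an assumption):

// Each thread that touches .Value gets its own list, created lazily by the factory.
private readonly ThreadLocal<List<Callback>> callbacks =
    new ThreadLocal<List<Callback>>(() => new List<Callback>());

public void ThreadLocalRegisterCallback(Callback someCallback)
{
    callbacks.Value.Add(someCallback); // .Value is specific to the current thread
}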
You should think about this twice. You are essentially creating a memory leak: every object created by the thread stays referenced and can't be garbage collected until the thread ends.
If you're looking to store unique data on a per-thread basis you could use Thread.SetData. Be sure to read up on the pros and cons (http://msdn.microsoft.com/en-us/library/6sby1byh.aspx), as this has performance implications.
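A short sketch of the data-slot API referred to above (the slot name is arbitrary):

// Named data slots: each thread sees its own value for the same named slot.
LocalDataStoreSlot slot = Thread.GetNamedDataSlot("PerThreadCounter");
Thread.SetData(slot, 42);
object value = Thread.GetData(slot); // 42 on this thread; null on threads that never set it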
Consider:
Rather than trying to give each member variable in an object a thread-specific value, give each thread its own object instance: pass the object to the thread start as state, or make the thread-start method a member of the object that the thread will "own", and create a new instance for each thread that you spawn.
Edit
(in response to Catskul's remark)
Here's an example of encapsulating the struct:
public class TheStructWorkerClass
{
private StructData TheStruct;
public TheStructWorkerClass(StructData yourStruct)
{
this.TheStruct = yourStruct;
}
public void ExecuteAsync()
{
System.Threading.ThreadPool.QueueUserWorkItem(this.TheWorkerMethod);
}
private void TheWorkerMethod(object state)
{
// your processing logic here
// you can access your structure as this.TheStruct;
// only this thread has access to the struct (as long as you don't pass the struct
// to another worker class)
}
}
// now the code that launches the async process does this:
var worker = new TheStructWorkerClass(yourStruct);
worker.ExecuteAsync();
Now here's option 2 (pass the struct as state)
{
// (from somewhere in your existing code
System.Threading.ThreadPool.QueueUserWorkItem(this.TheWorker, myStruct);
}
private void TheWorker(object state)
{
StructData yourStruct = (StructData)state;
// now do stuff with your struct
// works fine as long as you never pass the same instance of your struct to 2 different threads.
}
I ended up implementing and testing a version of what I had originally suggested:
public class ThreadLocal<T>
{
[ThreadStatic] private static Dictionary<object, T> _lookupTable;
private Dictionary<object, T> LookupTable
{
get
{
if ( _lookupTable == null)
_lookupTable = new Dictionary<object, T>();
return _lookupTable;
}
}
private object key = new object(); //lazy hash key creation handles replacement
private T originalValue;
public ThreadLocal( T value )
{
originalValue = value;
}
~ThreadLocal()
{
LookupTable.Remove(key);
}
public void Set( T value)
{
LookupTable[key] = value;
}
public T Get()
{
T returnValue;
if (!LookupTable.TryGetValue(key, out returnValue))
{
returnValue = originalValue; // fall back to the constructor value, not default(T)
Set(originalValue);
}
return returnValue;
}
}
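Usage of the class above would look roughly like this (values chosen arbitrarily for illustration):

// Each instance carries its own per-thread value; 5 is the fallback for
// threads that have never called Set on this instance.
var perThreadCounter = new ThreadLocal<int>(5);
perThreadCounter.Set(10);              // affects only the current thread
int current = perThreadCounter.Get();  // 10 here; 5 on any other thread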
Although I am still not sure when your use case would make sense (see my comment on the question itself), I would like to contribute a working example that is, in my opinion, more readable than thread-local storage (whether static or instance). The example uses .NET 3.5:
using System;
using System.Collections.Generic;
using System.Text;
using System.Threading;
using System.Linq;
namespace SimulatedThreadLocal
{
public sealed class Notifier
{
public void Register(Func<string> callback)
{
var id = Thread.CurrentThread.ManagedThreadId;
lock (this._callbacks)
{
List<Func<string>> list;
if (!this._callbacks.TryGetValue(id, out list))
{
this._callbacks[id] = list = new List<Func<string>>();
}
list.Add(callback);
}
}
public void Execute()
{
var id = Thread.CurrentThread.ManagedThreadId;
IEnumerable<Func<string>> threadCallbacks;
string status;
lock (this._callbacks)
{
status = string.Format("Notifier has callbacks from {0} threads, total {1} callbacks{2}Executing on thread {3}",
this._callbacks.Count,
this._callbacks.SelectMany(d => d.Value).Count(),
Environment.NewLine,
Thread.CurrentThread.ManagedThreadId);
threadCallbacks = this._callbacks[id]; // we can use the original collection, as only this thread can add to it and we're not going to be adding right now
}
var b = new StringBuilder();
foreach (var callback in threadCallbacks)
{
b.AppendLine(callback());
}
Console.ForegroundColor = ConsoleColor.DarkYellow;
Console.WriteLine(status);
Console.ForegroundColor = ConsoleColor.Green;
Console.WriteLine(b.ToString());
}
private readonly Dictionary<int, List<Func<string>>> _callbacks = new Dictionary<int, List<Func<string>>>();
}
public static class Program
{
public static void Main(string[] args)
{
try
{
var notifier = new Notifier();
var syncMainThread = new ManualResetEvent(false);
var syncWorkerThread = new ManualResetEvent(false);
ThreadPool.QueueUserWorkItem(delegate // will create closure to see notifier and sync* events
{
notifier.Register(() => string.Format("Worker thread callback A (thread ID = {0})", Thread.CurrentThread.ManagedThreadId));
syncMainThread.Set();
syncWorkerThread.WaitOne(); // wait for main thread to execute notifications in its context
syncWorkerThread.Reset();
notifier.Execute();
notifier.Register(() => string.Format("Worker thread callback B (thread ID = {0})", Thread.CurrentThread.ManagedThreadId));
syncMainThread.Set();
syncWorkerThread.WaitOne(); // wait for main thread to execute notifications in its context
syncWorkerThread.Reset();
notifier.Execute();
syncMainThread.Set();
});
notifier.Register(() => string.Format("Main thread callback A (thread ID = {0})", Thread.CurrentThread.ManagedThreadId));
syncMainThread.WaitOne(); // wait for worker thread to add its notification
syncMainThread.Reset();
notifier.Execute();
syncWorkerThread.Set();
syncMainThread.WaitOne(); // wait for worker thread to execute notifications in its context
syncMainThread.Reset();
notifier.Register(() => string.Format("Main thread callback B (thread ID = {0})", Thread.CurrentThread.ManagedThreadId));
notifier.Execute();
syncWorkerThread.Set();
syncMainThread.WaitOne(); // wait for worker thread to execute notifications in its context
syncMainThread.Reset();
}
finally
{
Console.ResetColor();
}
}
}
}
When you compile and run the above program, you should get output like this:
(screenshot of the console output, originally hosted at http://img695.imageshack.us/img695/991/threadlocal.png)
Based on your use-case I assume this is what you're trying to achieve. The example first adds two callbacks from two different contexts, main and worker threads. Then the example runs notification first from main and then from worker threads. The callbacks that are executed are effectively filtered by current thread ID. Just to show things are working as expected, the example adds two more callbacks (for a total of 4) and again runs the notification from the context of main and worker threads.
Note that Notifier class is a regular instance that can have state, multiple instances, etc (again, as per your question's use-case). No static or thread-static or thread-local is used by the example.
I would appreciate if you could look at the code and let me know if I misunderstood what you're trying to achieve or if a technique like this would meet your needs.
I'm not sure how you're spawning your threads in the first place, but there are ways to give each thread its own thread-local storage, without using hackish workarounds like the code you posted in your question.
public void SpawnSomeThreads(int threads)
{
for (int i = 0; i < threads; i++)
{
Thread t = new Thread(WorkerThread);
WorkerThreadContext context = new WorkerThreadContext
{
// whatever data the thread needs passed into it
};
t.Start(context);
}
}
private class WorkerThreadContext
{
public string Data { get; set; }
public int OtherData { get; set; }
}
private void WorkerThread(object parameter)
{
WorkerThreadContext context = (WorkerThreadContext) parameter;
// do work here
}
This obviously ignores waiting on the threads to finish their work, making sure accesses to any shared state is thread-safe across all the worker threads, but you get the idea.
Whilst the posted solution looks elegant, it leaks objects. The finalizer (LookupTable.Remove(key)) runs only in the context of the GC finalizer thread, so it likely just creates more garbage by creating another lookup table.
You need to remove the object from the lookup table of every thread that has accessed the ThreadLocal. The only elegant way I can think of to solve this is via a weak-keyed dictionary, a data structure which is strangely lacking from C#.
