Unreferenced Thread object dispose - C#

If I use a thread like this:
void Foo()
{
    new Thread(Work).Start(); // note: Thread has no parameterless constructor, so a ThreadStart delegate is needed here
}
since the Thread object is not referenced, will it be collected by the GC before the designated work is done?

The thread should stay alive until its method(s) return.
Check out: What prevents a Thread in C# from being Collected?

From MSDN:
It is not necessary to retain a reference to a Thread object once you have started the thread. The thread continues to execute until the thread procedure is complete.
The System.Threading.Thread class is really just there for bookkeeping/management. It isn't the actual mechanism that creates/maintains threads. That's managed by the runtime and is CLI implementation specific (for example, the Mono implementation may differ dramatically in thread management.)
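A quick way to convince yourself of this is a minimal console sketch (the names here are illustrative): the Thread object is never stored in a variable, an explicit GC.Collect() is forced, and the worker still runs to completion because a running thread is itself a GC root.

```csharp
using System;
using System.Threading;

class UnreferencedThreadDemo
{
    // The event lets Main observe completion without keeping a Thread reference.
    static readonly ManualResetEvent Done = new ManualResetEvent(false);

    static void Main()
    {
        // No variable keeps the Thread object reachable after Start().
        new Thread(() =>
        {
            Thread.Sleep(100);              // simulate some work
            Console.WriteLine("worker finished");
            Done.Set();
        }).Start();

        GC.Collect();                       // a running thread is a root; it is not collected
        GC.WaitForPendingFinalizers();

        Done.WaitOne();                     // completes: the thread ran to the end of its delegate
        Console.WriteLine("main observed completion");
    }
}
```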


In which context will thread safe singletons run?

If a singleton that is accessed from multiple threads is used, and the singleton itself is thread-safe, which thread will block when the singleton is accessed?
For example, suppose there is a main thread A. A first accesses the singleton S, then does something else.
A bit later, thread B accesses the singleton S.
If B accesses S, will the singleton still be in the context of thread A and also block thread A, or only thread B (and other threads actually trying to access it)?
-> accesses
A->S {}
A->X {}
B->S {
...
C->S
} - will B only block C, or also A?
To answer the questions, here is the thread-safe singleton (stripped down):
public sealed class Singleton
{
    private static volatile Singleton instance;
    private static object _sync = new Object();

    private Singleton()
    {
        // do something...
    }

    public static Singleton Instance
    {
        get
        {
            if (instance == null)
            {
                lock (_sync)
                {
                    if (instance == null)
                        instance = new Singleton();
                }
            }
            return instance;
        }
    }
}
(and of course locking in the methods)
And the question is mostly specific to the following point: I know that the lock will prevent multiple threads from accessing the same region of code at once.
If the original thread, in whose scope the Singleton was created, does not hold the lock, will it also be blocked when another thread acquires the lock, since the Singleton instance was created in its scope? Or will the singleton only run in the scope of the original thread?
Usually, thread safety for a singleton means mutual exclusion. That is, if a thread needs to use the singleton, it must acquire a lock/token, do what it needs, and release the token. While it is holding the token, no other thread can acquire it. Any thread that tries will be blocked, placed in a wait queue, and will receive the token once the holder releases it. This ensures only one thread at a time accesses the protected resource (a singleton object in this case).
This is the typical scenario; your mileage might vary.
On a related note, Singleton is considered a bad idea by most people.
Synchronization mechanisms for C# are covered in part 2 of the tutorial linked by makc, which is quite nice.
Thread safe normally means only one thread can access it at a time. Locks around critical sections will mean multiple threads trying to run that piece of code will be blocked and only one at a time can proceed and access it.
Let's assume, per your question, that the class is synchronized at the class level; then while A is calling methods on S, any other thread trying to call S at the same time will have to wait until A is finished.
Once A has finished running S then all waiting threads can be re-scheduled and one of them will then acquire the lock and run S (blocking any remaining waiting threads).
Meanwhile...A can go ahead and run X while someone else is accessing S (unless they share the same lock).
Think of a lock (specifically a mutex in this example) as a token: only the thread holding the token can run the code it protects. Once it's done, it drops the token, and the next thread that picks it up can proceed.
Typically your synchronisation is done at a finer-grained level than across the whole class, say on a specific method or a specific block of code within a method. This avoids threads wasting time waiting around when they could actually both access different methods that don't affect each other.
It'll depend on how thread-safe your singleton (or any other object) is.
For example, if you use a Monitor or Mutex, only one thread at a time will have access to the code block protected by one of these synchronization primitives. Say one thread tries to enter a synchronized code block while some other thread has acquired the lock: the second thread will wait until the first releases the lock.
On the other hand, if you use a Semaphore, you define how many threads can pass through a protected code block at once. Say the Semaphore allows 8 threads at the same time: if a 9th thread tries to enter the protected code block, it will wait until the Semaphore signals that one or more slots are available for the queued threads.
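The semaphore case can be sketched with SemaphoreSlim (the counts here are arbitrary: the gate admits at most 2 threads, and 8 workers contend for it):

```csharp
using System;
using System.Threading;

class SemaphoreDemo
{
    // At most 2 threads may be inside the protected region at once.
    static readonly SemaphoreSlim Gate = new SemaphoreSlim(2, 2);
    static readonly object StatsLock = new object();
    static int _inside, _maxInside;

    static void Worker()
    {
        Gate.Wait();                        // blocks while 2 threads are already inside
        try
        {
            lock (StatsLock) { _inside++; if (_inside > _maxInside) _maxInside = _inside; }
            Thread.Sleep(50);               // simulate work inside the region
            lock (StatsLock) { _inside--; }
        }
        finally
        {
            Gate.Release();                 // frees a slot for a queued thread
        }
    }

    static void Main()
    {
        var threads = new Thread[8];
        for (int i = 0; i < threads.Length; i++)
            (threads[i] = new Thread(Worker)).Start();
        foreach (var t in threads) t.Join();
        // The observed peak never exceeds the semaphore's count of 2.
        Console.WriteLine("max threads inside at once: " + _maxInside);
    }
}
```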
There are different strategies for synchronizing objects when using multi-threading.
Check this MSDN article:
http://msdn.microsoft.com/en-us/library/ms173178(v=vs.110).aspx
UPDATE
I've checked your code in your updated question body.
Definitively: yes. Any thread, even the main thread, will be blocked until the one that acquired the lock releases it.
Imagine it didn't work this way and the locking rules applied to every thread except the main one: there would then be a window for unsynchronized access to the protected code region.
The lock statement compiles into something like Monitor.Enter and Monitor.Exit calls. This means the current thread acquires an exclusive lock on the locked object.
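Roughly, the compiler expands a lock block into the shape below (the C# 4+ form with the lock-taken flag); this runnable sketch uses that expansion by hand and shows it gives the usual mutual exclusion:

```csharp
using System;
using System.Threading;

class LockExpansionDemo
{
    static readonly object _syncObject = new object();
    static int _counter;

    // Equivalent, roughly, to: lock (_syncObject) { _counter++; }
    static void Increment()
    {
        bool lockTaken = false;
        try
        {
            Monitor.Enter(_syncObject, ref lockTaken);
            _counter++;                     // protected region
        }
        finally
        {
            if (lockTaken) Monitor.Exit(_syncObject);
        }
    }

    static void Main()
    {
        var a = new Thread(() => { for (int i = 0; i < 100000; i++) Increment(); });
        var b = new Thread(() => { for (int i = 0; i < 100000; i++) Increment(); });
        a.Start(); b.Start(); a.Join(); b.Join();
        Console.WriteLine(_counter);        // always 200000: increments never interleave
    }
}
```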
UPDATE 2
Taken from some OP comment:
Can you explain why? I mean, if the main thread does nothing with the singleton at the moment, then the thread does not try to get that lock?
Ooops! I feel you forgot something about how threading works!
When you protect a code region using a thread synchronization approach like Monitor (the lock keyword uses a Monitor behind the scenes), you're blocking only the threads that try to enter the protected/synchronized region until the lock is released, not every working thread.
Let's say there are two threads A and B and you have this code:
lock (_syncObject)
{
    // Do some stuff
}
Thread A goes through the synchronized code region and B is a background worker that's doing some other stuff that won't go through the protected region. In this case, B won't be blocked.
In other words: when you synchronize threaded access to some region, you're protecting an object. lock (or Mutex, AutoResetEvent or whatever) is not equivalent to a hypothetical Thread.SleepAll(). If threads are started and working and none of them goes through the synchronized object access, no thread will be blocked.
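That point can be demonstrated directly (a minimal sketch; the events only sequence the demo so the outcome is deterministic): thread B never touches _syncObject, so it runs freely even while A holds the lock.

```csharp
using System;
using System.Threading;

class NotBlockedDemo
{
    static readonly object _syncObject = new object();

    static void Main()
    {
        var aHasLock = new ManualResetEvent(false);
        var bFinished = new ManualResetEvent(false);

        var a = new Thread(() =>
        {
            lock (_syncObject)
            {
                aHasLock.Set();
                bFinished.WaitOne();        // hold the lock until B is done
            }
        });

        var b = new Thread(() =>
        {
            aHasLock.WaitOne();             // make sure A already holds the lock
            // B never enters the protected region, so it is not blocked.
            Console.WriteLine("B ran while A held the lock");
            bFinished.Set();
        });

        a.Start(); b.Start();
        a.Join(); b.Join();
        Console.WriteLine("done");
    }
}
```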

Thread safety in C# lambdas

I came across a piece of C# code like this today:
lock (obj)
{
    // perform various operations
    ...

    // send a message via a queue, but in the same process: Post(yourData, callback)
    messagingBus.Post(data, () =>
    {
        // perform operation
        ...

        if (condition == true)
        {
            // perform a long running, out of process operation
            operation.Perform();
        }
    });
}
My question is this: can the callback function ever be invoked in such a way as to cause the lock(obj) to not be released before operation.Perform() is called? i.e., is there a way that the callback function can be invoked on the same thread that is holding the lock, and before that thread has released the lock?
EDIT: messagingBus.Post(...) can be assumed to be an insert on to a queue, that then returns immediately. The callback is invoked on some other thread, probably from the thread pool.
For the operation.Perform() you can read it as Thread.Sleep(10000) - just something that runs for a long time and doesn't share or mutate any state.
I'm going to guess.
Post in .net generally implies that the work will be done by another thread or at another time.
So yes, it's not only possible that the lock on obj will be released before Perform is called, it's fairly likely it will happen. However, it's not guaranteed. Perform may complete before the lock is released.
That doesn't mean it's a problem. The "perform various actions" part may need the lock. messagingBus may need the lock to queue the action. The work inside may not need the lock at all, in which case the code is thread safe.
This is all a guess because there's no notion of what work is being done, why it must be inside a lock, and what Post or perform does. So the code may be perfectly safe, or it may be horribly flawed.
Without knowing what messagingBus.Post is doing, you can't tell. If Post invokes the delegate it is given (the lambda expression in your example), then the lock will be in place while that lambda executes. If Post schedules that delegate for execution at a later time, then the lock will not be in place while the lambda executes. It's not clear what the lock(obj) is for: to lock calls to messagingBus.Post, or something else. Detailing the type (including full namespace) of the messagingBus variable would go a long way to providing better details.
If the callback executes asynchronously, then yes, the lock may still be held when Perform() runs, unless Post() does something specific to avoid that case (which would be unusual).
If the callback was scheduled on the same thread as the call to Post() (e.g. in the extreme example where the thread pool has only one thread), a typical thread pool implementation would not execute the callback until the thread finishes its current task, which in this case would require releasing the lock before executing Perform().
It's impossible to answer your question without knowing how messagingBus.Post is implemented. Async APIs typically provide no guarantee that the callback will be executed truly concurrently. For example, .NET APM methods such as FileStream.BeginRead may decide to perform the operation synchronously, in which case the callback will be executed on the same thread that called BeginRead. The returned IAsyncResult.CompletedSynchronously will be set to true in this case.
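That check can be observed directly (a console sketch; whether the read actually completes synchronously depends on the runtime and how the FileStream was opened, so only the flag's value is printed, not assumed):

```csharp
using System;
using System.IO;
using System.Threading;

class CompletedSynchronouslyDemo
{
    static void Main()
    {
        string path = Path.GetTempFileName();
        File.WriteAllBytes(path, new byte[1024]);

        var callbackDone = new ManualResetEvent(false);
        using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read))
        {
            var buffer = new byte[1024];
            IAsyncResult ar = fs.BeginRead(buffer, 0, buffer.Length, iar =>
            {
                // If true, the callback ran on the thread that called BeginRead.
                Console.WriteLine("CompletedSynchronously = " + iar.CompletedSynchronously);
                callbackDone.Set();
            }, null);

            int read = fs.EndRead(ar);      // blocks until the read finishes
            callbackDone.WaitOne();
            Console.WriteLine("read " + read + " bytes");
        }
        File.Delete(path);
    }
}
```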

System.Timers.Timer need to get back to 'Main' thread

I have a class (call it 'Foo') that has a System.Timers.Timer (call it 'myTimer').
Foo wraps unmanaged, non-thread-safe code.
On myTimer.Elapsed I need to use methods in Foo.
Right now myTimer is trying to use Foo's methods on worker threads, and this isn't working.
I need to get back to the thread that contains Foo's methods.
How do I accomplish this?
FYI, Foo's methods are on a non-UI thread.
If Foo is not on a UI thread and you need to invoke code to execute on it from a different thread, you will need to capture the synchronization context in which Foo runs and then invoke your code on it from the timer.
Take a look at the SynchronizationContext class. You can either Post (asynchronously) or Send a delegate to be executed on a specific, previously captured, synchronization context.
You may also want/need to look into the ExecutionContext class. In particular: ExecutionContext.Capture and ExecutionContext.Run
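A sketch of the capture-and-post pattern. Since a plain console thread has no synchronization context, the minimal queue-backed context below stands in for whatever context Foo's thread actually provides (in a UI app you would capture SynchronizationContext.Current instead of building one):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

// Illustration only: a context that queues posted delegates
// for one dedicated thread to execute.
class QueueSynchronizationContext : SynchronizationContext
{
    public readonly BlockingCollection<Action> Queue = new BlockingCollection<Action>();

    public override void Post(SendOrPostCallback d, object state)
    {
        Queue.Add(() => d(state));          // marshal the delegate to the loop thread
    }

    public void RunLoop()
    {
        foreach (var action in Queue.GetConsumingEnumerable())
            action();
    }
}

class Program
{
    static void Main()
    {
        var context = new QueueSynchronizationContext();

        // "Foo's thread": runs the loop, so posted delegates execute here.
        var fooThread = new Thread(context.RunLoop);
        fooThread.Start();

        // The timer's Elapsed handler runs on a worker thread and
        // marshals the real work back via Post.
        var timer = new System.Timers.Timer(50) { AutoReset = false };
        timer.Elapsed += (s, e) =>
            context.Post(_ =>
            {
                Console.WriteLine("running on Foo's thread: "
                                  + Thread.CurrentThread.ManagedThreadId);
                context.Queue.CompleteAdding();   // shut the loop down for the demo
            }, null);
        timer.Start();

        fooThread.Join();
    }
}
```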

Is it possible to give callback for threads created using Thread class

Is it possible to give a callback (announcing completion of the activity) for threads created using the Thread class? I created the thread the following way but could not find a way to provide the callback.
Thread thread = new Thread(StartPoll);
thread.SetApartmentState(ApartmentState.STA);
thread.Start();
Not directly. But you can always do something like:
new Thread(() => { StartPoll(); Callback(); }).Start();
Setting the apartment state to STA is not enough, the second requirement is that you must pump a message loop. Application.Run() in either Winforms or WPF. It is actually the message loop that permits marshaling a call to a specific thread. Implemented by respectively Control.Begin/Invoke and Dispatcher.Begin/Invoke().
That's however more of a UI implementation detail. The generic solution is very similar to what the UI thread does, you use a thread-safe queue and a loop in the "main" thread to read objects from the queue. Like a delegate you can invoke. A standard solution to the producer/consumer problem. The .NET 4 BlockingCollection class makes it easy. Rewriting the thread code so it loops and stays responsive to worker requests is not always so easy.
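The queue-and-loop idea from that answer, sketched with BlockingCollection (.NET 4; the names are illustrative): a worker thread produces delegates, and the "main" thread consumes and invokes them.

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

class MainThreadQueueDemo
{
    static void Main()
    {
        var work = new BlockingCollection<Action>();

        // Producer: a worker thread hands delegates to the "main" thread.
        var producer = new Thread(() =>
        {
            for (int i = 0; i < 3; i++)
            {
                int n = i;                  // capture the loop value, not the variable
                work.Add(() => Console.WriteLine("item " + n + " on main thread"));
            }
            work.CompleteAdding();          // no more work; lets the loop end
        });
        producer.Start();

        // Consumer loop on the main thread: blocks until work arrives.
        foreach (var action in work.GetConsumingEnumerable())
            action();

        producer.Join();
    }
}
```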

General Threading Questions

I'm kind of new to threading in C# and had a couple of questions about what is out there:
What are the ways to implement threads in C#? (I can think of two off the top of my head: BackgroundWorker, Thread, etc.)
How do you cause a deadlock, and if there is a deadlock, how do you get out of it (in C#)?
How does BackgroundWorker get implemented? It seems to have an underlying set of methods, but I'd like to know what those methods and instantiations are...
Thanks!
The definitive beginner's guide to threading in C# is here:
http://www.albahari.com/threading/
The documentation on BackgroundWorker, with a complete working example, is here: http://msdn.microsoft.com/en-us/library/system.componentmodel.backgroundworker.aspx
Deadlocks are explained here: http://www.albahari.com/threading/part2.aspx
Threads can be implemented in many ways. You can use them directly, pull them from a ThreadPool, or use them indirectly using the Task Parallel Library.
What are the ways to implement threads in C#?
There are various different ways to take advantage of threading; some involving the explicit creation of threads while others take advantage of already running threads.
The Thread class itself.
Queue a work item in the thread pool.
Use the BackgroundWorker class.
Use the Task Parallel Library (TPL).
Use Parallel LINQ.
Use asynchronous delegates.
Use timers like System.Threading.Timer and System.Timers.Timer.
How do you cause deadlock and if there is deadlock how do you get out of it (in C#)?
Here are 3 different ways you can cause a deadlock. This list is not exhaustive.
Call a blocking method from within a lock section.
In this example, thread A acquires a lock and then immediately calls a blocking method, while at the same time thread B attempts to acquire the same lock but hangs, because thread A is waiting for thread B to signal the event before it will release the lock.
public class Example
{
    ManualResetEvent m_Event = new ManualResetEvent(false);

    void ThreadA()
    {
        lock (this)
        {
            m_Event.WaitOne();
        }
    }

    void ThreadB()
    {
        lock (this)
        {
            m_Event.Set();
        }
    }
}
Acquire two locks out of order.
No explanation is needed here since this is a well known problem.
public class Example
{
    private object m_LockObjectA = new object();
    private object m_LockObjectB = new Object();

    void ThreadA()
    {
        lock (m_LockObjectA) lock (m_LockObjectB) { }
    }

    void ThreadB()
    {
        lock (m_LockObjectB) lock (m_LockObjectA) { }
    }
}
The lock-free deadlock.
This is one of my favorite illustrations of a deadlock, because no lock or blocking method is involved. The subtlety of the problem is enough to confound even those who are familiar with threading. The issue here is the absence of memory barriers. Thread A waits for thread B to set the signal flag, while at the same time thread B waits for thread A to reset it, and all the while neither thread sees the changes the other is making, because the compiler, JIT, and hardware are free to optimize the reads and writes of the flag in a manner that is non-intuitive.
public class Example
{
    private bool m_Signal = false;

    void ThreadA()
    {
        while (!m_Signal);
        m_Signal = false;
    }

    void ThreadB()
    {
        m_Signal = true;
        while (m_Signal);
    }
}
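The usual minimal fix for that example is to give the flag visibility guarantees, e.g. by marking it volatile, so each thread actually observes the other's writes. A runnable sketch (Thread.Yield in the spin loops is an addition, just to be polite to the scheduler):

```csharp
using System;
using System.Threading;

public class Example
{
    // volatile forces reads/writes of the flag to hit shared memory with
    // acquire/release semantics, so neither loop can spin forever on a stale value.
    private volatile bool m_Signal = false;

    public void ThreadA()
    {
        while (!m_Signal) Thread.Yield();   // wait for B to set the flag
        m_Signal = false;
    }

    public void ThreadB()
    {
        m_Signal = true;
        while (m_Signal) Thread.Yield();    // wait for A to reset it
    }
}

class Program
{
    static void Main()
    {
        var e = new Example();
        var a = new Thread(e.ThreadA);
        var b = new Thread(e.ThreadB);
        a.Start(); b.Start();
        a.Join(); b.Join();                 // with volatile, both loops terminate
        Console.WriteLine("both threads finished");
    }
}
```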
How does backgroundworker get implemented?
Here is a very simple step-by-step procedure to get you started.
Add an event handler that performs the actual work to the DoWork event.
Add an event handler to receive progress information to the ProgressChanged event.
Add an event handler that will be executed upon completion to the RunWorkerCompleted event.
Call RunWorkerAsync from the UI thread to start the background operation. This raises the DoWork event on a separate thread.
Call ReportProgress periodically from the DoWork event handler to publish new progress information. This raises the ProgressChanged event on the UI thread.
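The steps above wired together in a minimal form (a console sketch; in a real UI app you would call RunWorkerAsync from the UI thread so ProgressChanged and RunWorkerCompleted marshal back to it, whereas here they simply run on pool threads):

```csharp
using System;
using System.ComponentModel;
using System.Threading;

class BackgroundWorkerDemo
{
    static void Main()
    {
        var done = new ManualResetEvent(false);
        var worker = new BackgroundWorker { WorkerReportsProgress = true };

        worker.DoWork += (s, e) =>          // step 1: runs on a background thread
        {
            for (int i = 1; i <= 3; i++)
            {
                Thread.Sleep(20);           // simulate a slice of work
                worker.ReportProgress(i * 33);  // step 5: publish progress
            }
            e.Result = 42;
        };

        worker.ProgressChanged += (s, e) => // step 2: receive progress
            Console.WriteLine("progress: " + e.ProgressPercentage + "%");

        worker.RunWorkerCompleted += (s, e) =>  // step 3: completion handler
        {
            Console.WriteLine("result: " + e.Result);
            done.Set();
        };

        worker.RunWorkerAsync();            // step 4: start the background operation
        done.WaitOne();
    }
}
```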
.NET 4 offers Parallel LINQ. This is very nice if you want to parallelize a side-effect-free calculation that is easily expressible in a functional/LINQ style.
For most common uses and purposes, use Thread. If you want to communicate from some thread to the GUI, you may think of using BackgroundWorker, because it will automatically serialize calls to GUI methods (with Invoke()), so you won't have GUI locking issues.
And as far as deadlocks are concerned, don't worry about them yet. Deadlocks are possible only if you have two or more threads competing for the same set of resources, and I guess you won't tackle that just yet.
I would classify the answer into 3 sections. With .NET 4.0, all the examples above fall under 3 major categories:
1. Threads managed by the .NET thread pool (asynchronous delegate invocation, BackgroundWorker, etc.)
2. The Thread class: you have to manage the lifetime of the thread yourself
3. Parallel LINQ, which benefits from a multi-core CPU
