I have a custom collection (a thread-safe ObservableQueue). I implemented the business logic inside the collection class (i.e. dequeue the items one by one and expose them to the outside). This is working fine. To prevent the collection from blocking the thread it is initialised in, the ObservableQueue uses its own thread to perform that work. Now I am not perfectly sure what pitfalls could occur.
Is it a bad idea to initialise (not start! only initialise) the thread in the constructor? And what would be a good, if not best, practice for terminating the thread? Note, I don't need to know how to terminate a thread, that is working fine; I am rather interested in whether there is something wrong with doing it via the disposable pattern or via a method which would need to be called to terminate the thread. If implementing IDisposable, are there any things I have to take into account regarding the collection/queue?
Edit: The thread is actually only pre-initialised to prevent a NullReferenceException from being thrown in the Enqueue method, where it is properly initialised again (the Enqueue method is supposed to check whether a dequeuing thread is already running and, if not, start a new one). Note that whenever all items are dequeued and the thread has done its work it will no longer be alive, so any time the queue is empty and a new item is added a new thread will get started to process the queue:
if (!_dequeuingThread.IsAlive)
{
// start the dequeuing thread
_dequeuingThread = new Thread(new ThreadStart(StartDequeuing));
_dequeuingThread.Name = "DeQueueThread";
_dequeuingThread.Start();
}
The if-statement does need an initialised thread. There are other possible ways of achieving this, but pre-initialising the thread seemed the least bothersome. You see that after checking whether the thread is alive, which it should not be when only pre-initialised, it gets initialised again properly.
I don't see anything wrong with initialising in the constructor, but obviously bear in mind they will be initialised on a different thread than your worker thread.
As for stopping, I generally have a volatile boolean flag that the worker checks to keep running. If your worker thread sleeps at all, then have it wait on an event rather than sleeping, so you can wake it up immediately when stopping it.
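A minimal sketch of that stop pattern (the field names here are illustrative, not from the original code): the worker loops on a volatile flag and waits on an event instead of sleeping, so Stop() can wake it immediately.
private volatile bool _keepRunning = true;
private readonly AutoResetEvent _wakeUp = new AutoResetEvent(false);

private void WorkerLoop()
{
    while (_keepRunning)
    {
        // ... dequeue and process any pending items ...
        _wakeUp.WaitOne(1000, false); // wait for more work (or time out) instead of Thread.Sleep
    }
}

public void Stop()
{
    _keepRunning = false;
    _wakeUp.Set();              // wake the worker so it notices the flag right away
    _dequeuingThread.Join();    // optionally wait for it to finish
}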
There seems to be a problem with the fact that the consumer will create this collection object by calling its constructor and will assume that the object is initialised (that is what a constructor is supposed to do), which is not correct, as the initialisation happens on a separate thread created by the constructor. So you basically need to implement some sort of "asynchronous API" on this object: the consumer calls an initialise method (after creating the object with the constructor) and then learns that initialisation has completed either by passing a callback to the initialise method or by registering for an event on the collection object.
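One possible shape for such an asynchronous initialisation API (the member names here are hypothetical, not part of the original class): the constructor only stores state, while InitializeAsync starts the background work and raises an event when it is done.
public class ObservableQueue<T>
{
    public event EventHandler Initialized;

    public void InitializeAsync()
    {
        var t = new Thread(() =>
        {
            // ... do the expensive setup work here ...
            EventHandler handler = Initialized;
            if (handler != null)
                handler(this, EventArgs.Empty); // tell the consumer we are ready
        });
        t.IsBackground = true;
        t.Start();
    }
}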
Related
I was reading the AutoResetEvent documentation on MSDN and the following warning kind of bothers me:
"Important:
There is no guarantee that every call to the Set method will release a thread. If two calls are too close together, so that the second call occurs before a thread has been released, only one thread is released. It is as if the second call did not happen. Also, if Set is called when there are no threads waiting and the AutoResetEvent is already signaled, the call has no effect."
But this warning basically kills the very reason to have such a thread synchronization technique. For example, I have a list which will hold jobs. There is only one producer which will add jobs to the list, and I have consumers (more than one) waiting to get a job from the list, something like this:
Producer:
void AddJob(Job j)
{
lock(qLock)
{
jobQ.Enqueue(j);
}
newJobEvent.Set(); // newJobEvent is AutoResetEvent
}
Consumer
void Run()
{
while(canRun)
{
newJobEvent.WaitOne();
IJob job = null;
lock(qLock)
{
job = jobQ.Dequeue();
}
// process job
}
}
If the above warning is true, then if I enqueue two jobs very quickly, only one thread will pick up a job, won't it? I was under the assumption that Set is atomic, i.e. that it does the following:
Set the event
If threads are waiting, pick one thread to wake up
Reset the event
Run the selected thread
So I am basically confused about the warning on MSDN. Is it a valid warning?
Even if the warning isn't true and Set is atomic, why would you use an AutoResetEvent here? Let's say your producers queue up 3 jobs in a row and there's one consumer. After processing the 2nd job, the consumer blocks and never processes the third.
I would use a ReaderWriterLockSlim for this type of synchronization. Basically, you need multiple producers to be able to have write locks, but you don't want consumers to lock out producers for a long time while they are only reading the queue size.
The message on MSDN is a valid message indeed. What's happening internally is something like this:
Thread A waits for the event
Thread B sets the event
[If thread A is in a spinlock]
[yes] Thread A detects that the event is set, unsets it and resumes its work
[no] The event will tell thread A to wake up; once woken, thread A will unset the event and resume its work.
Note that the internal logic is not synchronous, since Thread B doesn't wait for Thread A to continue its business. You can make this synchronous by introducing a temporary ManualResetEvent that thread A has to signal once it continues its work and on which Thread B has to wait. This is not done by default due to the inner workings of the Windows threading model. I guess the documentation is misleading but correct in saying that a call to Set releases at most one waiting thread.
Alternatively, I would suggest you look at the BlockingCollection class in the System.Collections.Concurrent namespace of the BCL, introduced in .NET 4.0, which does exactly what you are trying to do.
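A minimal sketch of the same producer/consumer on top of BlockingCollection<T> (the Job type and ProcessJob method are placeholders for your own code):
BlockingCollection<Job> jobs = new BlockingCollection<Job>();

// producer
jobs.Add(new Job());

// consumer(s): GetConsumingEnumerable blocks until an item arrives, no manual events needed
foreach (Job job in jobs.GetConsumingEnumerable())
{
    ProcessJob(job);
}

// on shutdown, let the consumer loops finish
jobs.CompleteAdding();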
There are a hundred examples in blogs, etc. on how to implement a background worker that logs or gives status to a foreground GUI element. Most of them include an approach to handle the race condition that exists between spawning the worker thread and creating the foreground dialog with ShowDialog(). However, it occurred to me that a simple approach is to force the creation of the handle in the form constructor, so that the thread won't be able to trigger an Invoke/BeginInvoke call on the form prior to its handle creation.
Consider a simple example of a Logger class that uses a background worker thread to log to the foreground.
Assume, also, that we don't want NLog or some other heavy framework to do something so simple and lightweight.
My logger window is opened with ShowDialog() by the foreground thread, but only after the background "worker" thread is started. The worker thread calls logger.Log() which itself uses logForm.BeginInvoke() to update the log control correctly on the foreground thread.
public override void Log(string s)
{
form.BeginInvoke(logDelegate, s);
}
Where logDelegate is just a simple wrapper around "form.Log()" or some other code that may update a progress bar.
The problem lies in the race condition that exists: when the background worker thread starts logging before the foreground ShowDialog() is called, the form's Handle hasn't yet been created, so the BeginInvoke() call fails.
I'm familiar with the various approaches, including using a Form OnLoad event and a timer (keeping the worker task suspended until the OnLoad event generates a timer message that starts the task once the form is shown), or, as mentioned, using a queue for the messages. However, I think that simply forcing the dialog's handle to be created early (in the constructor) ensures there is no race condition, assuming the thread is spawned by the same thread that creates the dialog.
http://msdn.microsoft.com/en-us/library/system.windows.forms.control.handle(v=vs.71).aspx
MSDN says: "If the handle has not yet been created, referencing this property will force the handle to be created."
So my logger wraps a form, and its constructor does:
public SimpleProgressDialog() {
var h = form.Handle; // referencing the property forces the handle to be created
}
The solution seems too simple to be correct. I'm specifically interested in why the seemingly too simple solution is or isn't safe to use.
Any comments? Am I missing something else?
EDIT: I'm NOT asking for alternatives. Not asking how to use NLog or Log4net, etc. If I were, I'd write a page about all of the customer constraints on this app, etc.
By the number of upvotes, there are a lot of other people that would like to know the answer too.
If you are concerned that referencing Control.Handle relies on a side effect in order to create the handle, you can simply call Control.CreateControl() to create it. However, referencing the property has the benefit of not initializing it if it already exists.
As for whether this is safe or not assuming the handle is created, you are correct: as long as you create the handle before spawning the background task on the same thread, you will avoid a race condition.
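A sketch of that ordering, with hypothetical names, all running on the UI thread: force the handle first, start the worker second, show the dialog last.
LoggerForm form = new LoggerForm();         // hypothetical Form-derived logger window
var h = form.Handle;                        // force handle creation (or call form.CreateControl())

Thread worker = new Thread(() => RunWorkAndLog(form));  // hypothetical worker entry point
worker.IsBackground = true;
worker.Start();                             // safe: BeginInvoke on the form now has a handle to target

form.ShowDialog();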
My two cents: there's no real need to force early handle creation if the logging framework simply maintains a buffer of undisplayed log entries while the handle has not been created. It could be implemented as a Queue, or many other things. Messing with the order of handle creation in .NET makes me squeamish.
I think the only danger is decreased performance. Handle creation is deferred in WinForms to speed things up. However, since it sounds like this is a one-time operation, it doesn't sound costly, so I think your approach is fine.
You can always check the IsHandleCreated property of your form to see if the handle has been built yet; however, there are some caveats. I've been in a similar spot to yours, where winforms controls are being created/destroyed dynamically with lots of multithreading going on. The pattern we wound up using was quite a bit like this:
private void SomeEventHandler(object sender, EventArgs e) // called from a bg thread
{
MethodInvoker ivk = delegate
{
if(this.IsDisposed)
return; // bail out! Run away!
// maybe look for queued stuff if it exists?
// the code to run on the UI thread
};
if(this.IsDisposed)
return; // run away! killer rabbits with pointy teeth!
if(!this.IsHandleCreated) // handle not built yet, do something in the meantime
DoSomethingToQueueTheCall(ivk);
else
this.BeginInvoke(ivk);
}
The big lesson here is to expect a kaboom if you attempt to interact with your form after it has been disposed. Don't rely on InvokeRequired, since it will return false on any thread if the control's handle hasn't been created yet. Also don't rely solely on IsHandleCreated since that will return false after the control has been disposed.
Basically, you have three flags whose state will tell you what you need to know about the control's initialization state and whether or not you're on a BG thread relative to the control.
The control can be in one of three initialization states:
Uninitialized, no handle created yet
InvokeRequired returns false on every thread
IsHandleCreated returns false
IsDisposed returns false
Initialized, ready, active
InvokeRequired does what the docs say
IsHandleCreated returns true
IsDisposed returns false
Disposed
InvokeRequired returns false on every thread
IsHandleCreated returns false
IsDisposed returns true
Hope this helps.
Since you create the window on the calling thread, you can end up with deadlocks. If the thread that creates the window has no message pump running, your BeginInvoke will add your delegate call to a message queue that never gets emptied, unless you have an Application.Run() on that same thread to process the window messages.
It is also very slow to send a window message for every log entry. It is much better to have a producer-consumer model where your logging thread adds a message to a Queue<string> which is emptied by another thread. The only time you need to lock is when you enqueue or dequeue a message. The consumer thread can wait on an event with a timeout, so it starts processing the next message when either the event is signaled or the timeout (e.g. 100 ms) has elapsed.
A thread safe blocking Queue can be found here.
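A rough sketch of that approach (the names are made up, and _running stands for whatever stop flag the consumer uses): producers only enqueue strings, and a dedicated consumer thread drains the queue and forwards batches to the UI.
private readonly Queue<string> _pending = new Queue<string>();
private readonly object _sync = new object();
private readonly AutoResetEvent _newMessage = new AutoResetEvent(false);

public void Log(string s)
{
    lock (_sync)
        _pending.Enqueue(s);
    _newMessage.Set();
}

private void ConsumerLoop() // runs on its own background thread
{
    while (_running)
    {
        _newMessage.WaitOne(100, false); // wake on signal or after 100 ms
        List<string> batch = new List<string>();
        lock (_sync)
        {
            while (_pending.Count > 0)
                batch.Add(_pending.Dequeue());
        }
        if (batch.Count > 0)
            form.BeginInvoke(new Action(() => AppendToLog(batch))); // one UI message per batch
    }
}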
I have a producer-consumer scenario in ASP.NET. I designed a Producer class, a Consumer class and a class for holding the shared objects that is responsible for communication between Producer and Consumer; let's call it Mediator. Because I fork the execution path at start-up (in the parent object), one thread calls Producer.Start() and another thread calls Consumer.Start(), so I need to pass a reference to Mediator to both Producer and Consumer (via the constructor). Mediator is a smart class which will optimize many things, like the length of its inner queue, but for now consider it a circular blocking queue. Producer enqueues new objects into Mediator until the queue gets full and then blocks; Consumer dequeues objects from Mediator until there's nothing left in the queue. For signaling between threads, I implemented two methods in the Mediator class: Wait() and Pulse(). The code is something like this:
class Mediator
{
    private object _locker = new object();

    public void Wait()
    {
        lock (_locker)
            Monitor.Wait(_locker);
    }

    public void Pulse()
    {
        lock (_locker)
            Monitor.Pulse(_locker);
    }
}
// This way threads are signaling:
class Consumer
{
    public void Run()
    {
        object x;
        if (Mediator.TryDequeue(out x))
        {
            // Do something
        }
        else
        {
            Mediator.Wait();
        }
    }
}
Inside Mediator I use this.Pulse() every time something is Enqueued or Dequeued so waiting threads would be signaled and continue their work.
But I encounter deadlocks, and because I have never used this kind of design for signaling between threads, I'm not sure whether something is wrong with the design or whether I'm doing something wrong elsewhere.
Thanks
There is not much code here to go on, but my best guess is that you have a live-lock problem. If Mediator.Pulse is called before Mediator.Wait then the signal gets lost even though there is something in the queue. Here is the standard pattern for implementing the blocking queue.
public class BlockingQueue<T>
{
private Queue<T> m_Queue = new Queue<T>();
public void Enqueue(T item)
{
lock (m_Queue)
{
m_Queue.Enqueue(item);
Monitor.Pulse(m_Queue);
}
}
public T Dequeue()
{
lock (m_Queue)
{
while (m_Queue.Count == 0)
{
Monitor.Wait(m_Queue);
}
return m_Queue.Dequeue();
}
}
}
Notice how Monitor.Wait is only called when the queue is empty. Also notice how it is called in a while loop. This is because a Wait does not have priority over an Enter, so a new thread coming into Dequeue could take the last item even though a call to Wait is ready to return. Without the loop, a thread could attempt to remove an item from an empty queue.
If you can use .NET 4 your best bet would be to use BlockingCollection<T> (http://msdn.microsoft.com/en-us/library/dd267312.aspx) which handles queueing, dequeuing, and limits on queue length.
Nothing is wrong with the design.
The problem arises when you use Monitor.Wait() and Monitor.Pulse() and you don't know which thread is going to do its job first (producer or consumer). In that case using an AutoResetEvent resolves the problem. Think of the consumer when it reaches the section where it should consume the data produced by the producer. Maybe it gets there before the producer pulses, in which case everything is OK; but what if the consumer gets there after the producer has already signaled? Then you encounter a deadlock, because the producer already called Monitor.Pulse() for that section and will not repeat it.
Using an AutoResetEvent you make sure the consumer waits there for the signal from the producer, and if the producer has already signaled before the consumer even reaches the section, the gate is open and the consumer continues.
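A sketch of what is meant (the names are hypothetical): because an AutoResetEvent remembers a Set until someone waits on it, the order of producer and consumer no longer matters for a single hand-off.
AutoResetEvent itemReady = new AutoResetEvent(false);

// producer
queue.Enqueue(item);
itemReady.Set();        // even if the consumer is not waiting yet, the signal is remembered

// consumer
itemReady.WaitOne();    // returns immediately if Set already happened, otherwise blocks
// ... dequeue and process the item ...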
It's OK to use Monitor.Wait() and Monitor.Pulse() inside Mediator for signaling waiting threads.
Is it possible that the deadlock is occurring because Pulse doesn't store any state? This means that if the Producer calls Pulse before/after Consumer calls Wait, then the Wait will block. This is the note in the documentation for Monitor.Pulse
Also, you should know that object x = new object(); is extraneous - an out call will initialize x, so the object created will fall out of scope with the TryDequeue call.
Difficult to tell with the code sample supplied.
Is the lock held elsewhere? Within Mediator?
Are the threads just parked on obtaining the lock and not on the actual Wait call?
Have you paused the threads in a debugger to see what the current state is?
Have you tried a simple test with just putting a simple single value on a queue and getting it to work? Or is Mediator pretty complex at this point?
Until a little more detail is available in the Mediator class and your producer class, it's some wild guessing. It seems like some thread may be holding the lock when you don't expect it to. Once you pulse, you do need to free the lock in whatever thread may have it by exiting the "lock" scope. So, if somewhere in Mediator you have the lock and then call Pulse, you need to exit the outer most scope where the lock is held and not just the one in Pulse.
Can you refactor to a normal consumer/producer queue? That could then handle enqueuing, dequeuing and thread signalling in a single class, so there is no need to pass around public locks. The dequeuing process could then be handled via a delegate. I can post an example if you wish.
I asked the question below a couple of weeks ago. Now, when reviewing my question and all the answers, a very important detail jumped out at me: in my second code example, isn't DoTheCodeThatNeedsToRunAsynchronously() executed on the main (UI) thread? Doesn't the timer just wait a second and then post an event to the main thread? This would mean that the code-that-needs-to-run-asynchronously isn't run asynchronously at all?!
Original question:
I have recently faced a problem multiple times and solved it in different ways, always being uncertain on whether it is thread safe or not: I need to execute a piece of C# code asynchronously. (Edit: I forgot to mention I'm using .NET 3.5!)
That piece of code works on an object that is provided by the main thread code. (Edit: Let's assume that object is thread-safe in itself.) I'll present you two ways I tried (simplified) and have these four questions:
What is the best way to achieve what I want? Is it one of the two or another approach?
Is one of the two ways not thread-safe (I fear both...) and why?
The first approach creates a thread and passes it the object in the constructor. Is that how I'm supposed to pass the object?
The second approach uses a timer which doesn't provide that possibility, so I just use the local variable in the anonymous delegate. Is that safe or is it possible in theory that the reference in the variable changes before it is evaluated by the delegate code? (This is a very generic question whenever one uses anonymous delegates). In Java you are forced to declare the local variable as final (i.e. it cannot be changed once assigned). In C# there is no such possibility, is there?
Approach 1: Thread
new Thread(new ParameterizedThreadStart(
delegate(object parameter)
{
Thread.Sleep(1000); // wait a second (for a specific reason)
MyObject myObject = (MyObject)parameter;
DoTheCodeThatNeedsToRunAsynchronously();
myObject.ChangeSomeProperty();
})).Start(this.MyObject);
There is one problem I had with this approach: my main thread might crash, but the process still persists in memory due to the zombie thread.
Approach 2: Timer
MyObject myObject = this.MyObject;
System.Timers.Timer timer = new System.Timers.Timer();
timer.Interval = 1000;
timer.AutoReset = false; // i.e. only run the timer once.
timer.Elapsed += new System.Timers.ElapsedEventHandler(
delegate(object sender, System.Timers.ElapsedEventArgs e)
{
DoTheCodeThatNeedsToRunAsynchronously();
myObject.ChangeSomeProperty();
});
DoSomeStuff();
myObject = that.MyObject; // hypothetical second assignment.
The local variable myObject is what I'm talking about in question 4. I've added a second assignment as an example. Imagine the timer elapses after the second assignment: will the delegate code operate on this.MyObject or that.MyObject?
Whether or not either of these pieces of code is safe has to do with the structure of MyObject instances. In both cases you are sharing the myObject variable between the foreground and background threads. There is nothing stopping the foreground thread from modifying myObject while the background thread is running.
This may or may not be safe and depends on the structure of MyObject. However if you haven't specifically planned for it then it's most certainly an unsafe operation.
I recommend using Task objects, and restructuring the code so that the background task returns its calculated value rather than changing some shared state.
I have a blog entry that discusses five different approaches to background tasks (Task, BackgroundWorker, Delegate.BeginInvoke, ThreadPool.QueueUserWorkItem, and Thread), with the pros and cons of each.
To answer your questions specifically:
What is the best way to achieve what I want? Is it one of the two or another approach? The best solution is to use the Task object instead of a specific Thread or timer callback. See my blog post for all the reasons why, but in summary: Task supports returning a result, callbacks on completion, proper error handling, and integration with the universal cancellation system in .NET.
Is one of the two ways not thread-safe (I fear both...) and why? As others have stated, this totally depends on whether MyObject.ChangeSomeProperty is threadsafe. When dealing with asynchronous systems, it's easier to reason about threadsafety when each asynchronous operation does not change shared state, and rather returns a result.
The first approach creates a thread and passes it the object in the constructor. Is that how I'm supposed to pass the object? Personally, I prefer using lambda binding, which is more type-safe (no casting necessary).
The second approach uses a timer which doesn't provide that possibility, so I just use the local variable in the anonymous delegate. Is that safe or is it possible in theory that the reference in the variable changes before it is evaluated by the delegate code? Lambdas (and delegate expressions) bind to variables, not to values, so the answer is yes: the reference may change before it is used by the delegate. If the reference may change, then the usual solution is to create a separate local variable that is only used by the lambda expression,
as such:
MyObject myObject = this.MyObject;
...
timer.AutoReset = false; // i.e. only run the timer once.
var localMyObject = myObject; // copy for lambda
timer.Elapsed += new System.Timers.ElapsedEventHandler(
delegate(object sender, System.Timers.ElapsedEventArgs e)
{
DoTheCodeThatNeedsToRunAsynchronously();
localMyObject.ChangeSomeProperty();
});
// Now myObject can change without affecting timer.Elapsed
Tools like ReSharper will try to detect whether local variables bound in lambdas may change, and will warn you if it detects this situation.
My recommended solution (using Task) would look something like this:
var ui = TaskScheduler.FromCurrentSynchronizationContext();
var localMyObject = this.myObject;
Task.Factory.StartNew(() =>
{
// Run asynchronously on a ThreadPool thread.
Thread.Sleep(1000); // TODO: review if you *really* need this
return DoTheCodeThatNeedsToRunAsynchronously();
}).ContinueWith(task =>
{
// Run on the UI thread when the ThreadPool thread returns a result.
if (task.IsFaulted)
{
// Do some error handling with task.Exception
}
else
{
localMyObject.ChangeSomeProperty(task.Result);
}
}, ui);
Note that since the UI thread is the one calling MyObject.ChangeSomeProperty, that method doesn't have to be threadsafe. Of course, DoTheCodeThatNeedsToRunAsynchronously still does need to be threadsafe.
"Thread-safe" is a tricky beast. With both of your approches, the problem is that the "MyObject" your thread is using may be modified/read by multiple threads in a way that makes the state appear inconsistent, or makes your thread behave in a way inconsistent with actual state.
For example, say your MyObject.ChangeSomeproperty() MUST be called before MyObject.DoSomethingElse(), or it throws. With either of your approaches, there is nothing to stop any other thread from calling DoSomethingElse() before the thread that will call ChangeSomeProperty() finishes.
Or, if ChangeSomeProperty() happens to be called by two threads, and it (internally) changes state, the thread context switch may happen while the first thread is in the middle of it's work and the end result is that the actual new state after both threads is "wrong".
However, by itself, neither of your approaches is inherently thread-unsafe; they just need to make sure that changing state is serialized and that accessing state always gives a consistent result.
Personally, I wouldn't use the second approach. If you're having problems with "zombie" threads, set IsBackground to true on the thread.
Your first attempt is pretty good, but the thread will continue to exist even after the application exits, because you didn't set the IsBackground property to true... here is a simplified (and improved) version of your code:
MyObject myObject = this.MyObject;
Thread t = new Thread(()=>
{
Thread.Sleep(1000); // wait a second (for a specific reason)
DoTheCodeThatNeedsToRunAsynchronously();
myObject.ChangeSomeProperty();
});
t.IsBackground = true;
t.Start();
With regards to the thread safety: it's difficult to tell if your program functions correctly when multiple threads execute simultaneously, because you're not showing us any points of contention in your example. It's very possible that you will experience concurrency issues if your program has contention on MyObject.
Java has the final keyword and C# has a corresponding keyword called readonly, but neither final nor readonly ensures that the state of the object you're modifying will be consistent between threads. The only thing these keywords do is ensure that you do not change the reference the variable is pointing to. If two threads have read/write contention on the same object, then you should perform some type of synchronization or atomic operations on that object in order to ensure thread safety.
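A small illustration of that distinction (the surrounding class is a placeholder): readonly fixes the reference, not the object's state.
public class Holder
{
    private readonly MyObject _obj = new MyObject();

    public void Mutate()
    {
        // _obj = new MyObject();   // compile error: a readonly field cannot be reassigned
        _obj.ChangeSomeProperty();  // still allowed, and still needs synchronization across threads
    }
}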
Update
OK, if you modify the reference that myObject points to, then your contention is now on myObject. I'm sure that my answer will not match your actual situation 100%, but given the example code you've provided I can tell you what will happen:
You will not be guaranteed which object gets modified: it could be that.MyObject or this.MyObject. That's true regardless of whether you're working with Java or C#. The scheduler may schedule your thread/timer to be executed before, after or during the second assignment. If you're counting on a specific order of execution, then you have to do something to ensure that order of execution. Usually that something is communication between the threads in the form of a signal: a ManualResetEvent, Join or something else.
Here is a join example:
MyObject myObject = this.MyObject;
Thread task = new Thread(()=>
{
Thread.Sleep(1000); // wait a second (for a specific reason)
DoTheCodeThatNeedsToRunAsynchronously();
myObject.ChangeSomeProperty();
});
task.IsBackground = true;
task.Start();
task.Join(); // blocks the main thread until the task thread is finished
myObject = that.MyObject; // the assignment will happen after the task is complete
Here is a ManualResetEvent example:
ManualResetEvent done = new ManualResetEvent(false);
MyObject myObject = this.MyObject;
Thread task = new Thread(()=>
{
Thread.Sleep(1000); // wait a second (for a specific reason)
DoTheCodeThatNeedsToRunAsynchronously();
myObject.ChangeSomeProperty();
done.Set();
});
task.IsBackground = true;
task.Start();
done.WaitOne(); // blocks the main thread until the task thread signals it's done
myObject = that.MyObject; // the assignment will happen after the task is done
Of course, in this case it's pointless to even spawn multiple threads, since you're not going to allow them to run concurrently. One way to avoid this is by not changing the reference to myObject after you've started the thread, then you won't need to Join or WaitOne on the ManualResetEvent.
So this leads me to a question: why are you assigning a new object to myObject? Is this a part of a for-loop which is starting multiple threads to perform multiple asynchronous tasks?
What is the best way to achieve what I want? Is it one of the two or another approach?
Both look fine, but...
Is one of the two ways not thread-safe (I fear both...) and why?
...they are not thread safe unless MyObject.ChangeSomeProperty() is thread safe.
The first approach creates a thread and passes it the object in the constructor. Is that how I'm supposed to pass the object?
Yes. Using a closure (as in your second approach) is fine as well, with the additional advantage that you don't need to do a cast.
The second approach uses a timer which doesn't provide that possibility, so I just use the local variable in the anonymous delegate. Is that safe or is it possible in theory that the reference in the variable changes before it is evaluated by the delegate code? (This is a very generic question whenever one uses anonymous delegates).
Sure, if you add myObject = null; directly after setting timer.Elapsed, then the code in your thread will fail. But why would you want to do that? Note that changing this.MyObject will not affect the variable captured in your thread.
So, how to make this thread-safe? The problem is that myObject.ChangeSomeProperty(); might run in parallel with some other code that modifies the state of myObject. There are basically two solutions to that:
Option 1: Execute myObject.ChangeSomeProperty() in the main UI thead. This is the simplest solution if ChangeSomeProperty is fast. You can use the Dispatcher (WPF) or Control.Invoke (WinForms) to jump back to the UI thread, but the easiest way is to use a BackgroundWorker:
MyObject myObject = this.MyObject;
var bw = new BackgroundWorker();
bw.DoWork += (sender, args) => {
// this will happen in a separate thread
Thread.Sleep(1000);
DoTheCodeThatNeedsToRunAsynchronously();
};
bw.RunWorkerCompleted += (sender, args) => {
// We are back in the UI thread here.
if (args.Error != null) // if an exception occurred during DoWork,
MessageBox.Show(args.Error.ToString()); // do your error handling here
else
myObject.ChangeSomeProperty();
};
bw.RunWorkerAsync(); // start the background worker
Option 2: Make the code in ChangeSomeProperty() thread-safe by using the lock keyword (inside ChangeSomeProperty as well as inside any other method modifying or reading the same backing field).
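A sketch of option 2 (the backing field here is hypothetical): every method that reads or writes the same state takes the same lock.
private readonly object _stateLock = new object();
private int _someProperty;

public void ChangeSomeProperty()
{
    lock (_stateLock)
    {
        _someProperty++;
    }
}

public int ReadSomeProperty()
{
    lock (_stateLock)
    {
        return _someProperty;
    }
}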
The bigger thread-safety concern here, in my mind, may be the 1 second Sleep. If this is required in order to synchronize with some other operation (giving it time to complete), then I strongly recommend using a proper synchronization pattern rather than relying on the Sleep. Monitor.Pulse or AutoResetEvent are two common ways to achieve synchronization. Both should be used carefully, as it's easy to introduce subtle race conditions. However, using Sleep for synchronization is a race condition waiting to happen.
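For example, a minimal sketch of replacing the Sleep with an explicit signal (all names are made up): the operation being waited for sets an event when it finishes, and the background code waits on that event instead of assuming one second is enough.
AutoResetEvent otherOperationDone = new AutoResetEvent(false);

// in the other operation, when it completes:
otherOperationDone.Set();

// in the background code, instead of Thread.Sleep(1000):
otherOperationDone.WaitOne();   // or WaitOne(timeout, false) if you want an upper bound
DoTheCodeThatNeedsToRunAsynchronously();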
Also, if you want to use a thread (and don't have access to the Task Parallel Library in .NET 4.0), then ThreadPool.QueueUserWorkItem is preferable for short-running tasks. Thread pool threads are background threads, so they won't keep the application alive when it shuts down, as long as there is not some deadlock preventing a non-background thread from exiting.
One thing not mentioned so far: The choice of threading methods depends heavily on specifically what DoTheCodeThatNeedsToRunAsynchronously() does.
Different .NET threading approaches are suitable for different requirements. One very large concern is whether this method will complete quickly, or take some time (is it short-lived or long-running?).
Some .NET threading mechanisms, like ThreadPool.QueueUserWorkItem(), are for use by short-lived threads. They avoid the expense of creating a thread by using "recycled" threads--but the number of threads it will recycle is limited, so a long-running task shouldn't hog the ThreadPool's threads.
Other options to consider are using:
ThreadPool.QueueUserWorkItem() is a convenient means to fire-and-forget small tasks on a ThreadPool thread
System.Threading.Tasks.Task is a new feature in .NET 4 which makes small tasks easy to run in async/parallel mode.
Delegate.BeginInvoke() and Delegate.EndInvoke(): BeginInvoke() will run the code asynchronously, but it's crucial that you ensure EndInvoke() is called as well, to avoid potential resource leaks. It's also based on ThreadPool threads, I believe.
System.Threading.Thread as shown in your example. Threads provide the most control but are also more expensive than the other methods--so they are ideal for long-running tasks or detail-oriented multithreading.
Overall my personal preference has been to use Delegate.BeginInvoke()/EndInvoke() -- it seems to strike a good balance between control and ease of use.
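A rough sketch of that pairing (this assumes the classic .NET Framework; delegate BeginInvoke/EndInvoke is not supported on .NET Core and later):
Action work = () => DoTheCodeThatNeedsToRunAsynchronously();
work.BeginInvoke(ar =>
{
    try
    {
        work.EndInvoke(ar);          // always pair EndInvoke with BeginInvoke to observe exceptions and free resources
    }
    catch (Exception ex)
    {
        Console.Error.WriteLine(ex); // handle or log the exception from the background call
    }
}, null);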
I have a list of objects,
for each object I want to run a totally separate thread (for thread safety). For example, I will pick one object from my list in a while loop and run a thread for it, then for the next object run the next thread. All threads should be synchronized such that resources (values, connections (open/close)) shared by them do not change.
Starting a thread per object is not necessarily wise; you should probably have a small number of worker threads picking items off the list (or better, a Queue<T>), synchronizing access to that list/queue. An example of a thread-safe queue can be found in this thread.
Once you have a work item, there is no magic bullet for making the rest of the code you write (to process it) thread-safe. A sensible approach that keeps things simple is immutability - either true immutability (the items can't change), or simply don't change the object. You can of course implement locking around the work item, but this only helps if all your code uses the same locking strategy, which is hard to enforce.
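A minimal sketch of the worker-pool idea (WorkItem, items and Process are placeholders for your own types and code): a handful of threads drain one shared, lock-protected queue instead of one thread per object.
Queue<WorkItem> queue = new Queue<WorkItem>(items);
object queueLock = new object();

for (int i = 0; i < 4; i++)            // a small, fixed number of workers
{
    Thread worker = new Thread(() =>
    {
        while (true)
        {
            WorkItem item;
            lock (queueLock)
            {
                if (queue.Count == 0)
                    return;            // nothing left, let this worker exit
                item = queue.Dequeue();
            }
            Process(item);             // process outside the lock
        }
    });
    worker.IsBackground = true;
    worker.Start();
}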
I will pick one object from my list in a while loop and run a thread for it, then for the next object run the next thread
If I really wanted a thread per object, which I probably wouldn't, I would create a class like this:
public class ObjectProcessingThread
{
    private Thread processingThread;

    public object TargetObject { get; set; }

    public void Start()
    {
        // start the processing thread with ThreadEntryPoint as the work the thread will do
        processingThread = new Thread(ThreadEntryPoint);
        processingThread.IsBackground = true;
        processingThread.Start();
    }

    private void ThreadEntryPoint()
    {
        // do stuff with TargetObject
    }
}
Then, in the while loop, new up an ObjectProcessingThread for each object, setting its TargetObject property, then calling Start.
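For instance, a usage sketch (myObjects is a placeholder for your list):
foreach (var obj in myObjects)
{
    var worker = new ObjectProcessingThread { TargetObject = obj };
    worker.Start();
}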
all threads should be synchronized such that resources (values, connections (open/close)) shared by them do not change
If you don't want values to change, don't change them.