I have a windows service that is designed to handle incoming data, process it, and alert users if necessary. One thing that I am having trouble figuring out is how to keep a thread alive.
I have a few classes that share a ConcurrentBag of Device objects. The DeviceManager class is tasked with populating this collection and updating the device objects if a parameter about a device changes in the database. So for example, in the database someone updates device 23 to have a normal high of 50F. The DeviceManager would update the appropriate device in memory to have this new value.
Oracle provides an event handler to be notified when a table changes (docs here). I want to attach an event handler so I can be notified when to update my devices in memory. The problem is: how can I create a thread for my DeviceManager to work in, and have it just idle in that thread until the event occurs and is handled there? I would like the event to fire and be handled in this thread instead of the main one.
You can create a separate worker thread when your service starts up. The worker thread will connect to the database and listen for change notifications, and update your ConcurrentBag accordingly. When the service is shut down, you can gracefully terminate the thread.
MSDN has an example that I think will help you: How to: Create and Terminate Threads
There are a large number of synchronization techniques available in .NET, and discussing them all would be too broad here. However, you should look at the Monitor class, with its Wait() and Pulse() methods.
For example:
private readonly object _lockObj = new object();

public void StartThread()
{
    new Thread(ThreadProc).Start();
}

public void SignalThread()
{
    lock (_lockObj)
    {
        // Initialize some data that the thread will use here...
        // Then signal the thread
        Monitor.Pulse(_lockObj);
    }
}

private void ThreadProc()
{
    lock (_lockObj)
    {
        // Wait for the signal
        Monitor.Wait(_lockObj);
        // Here, use data initialized by the other thread
    }
}
Of course you can put the thread's locking/waiting code in a loop if you need for the thread to repeat the operation.
It looks like there's no shortage of other questions involving the Monitor class on SO:
https://stackoverflow.com/search?q=%5Bc%23%5D+monitor+pulse+wait
And of course, the documentation on MSDN has other examples as well.
Related
I am considering creating an asynchronous logging component with a dedicated thread that will read new items from a queue and write them to a database, file, etc. If I create the thread as a background one, it will be terminated as soon as the process ends, so all items in the queue will be lost. If I create it as a foreground one, I will have to figure out when to stop it, as it will prevent the application from closing. Is there any way to avoid making developers remember to 'stop' the logging functionality before the application exits?
I believe you can:
Subscribe to the AppDomain.ProcessExit event;
Use a volatile sentinel variable as a shutdown flag;
Set the flag when the ProcessExit event fires;
Monitor the state of the flag inside your thread, and gracefully shut down accordingly.
This way you may keep a foreground thread aware of impending doom.
First of all I have to agree with the comments above. I would just use something like NLog rather than trying to roll my own. While it may seem like there is a lot to learn at first, it is still better than writing and debugging your own.
If you really want to travel this road, my recommendation would be to use a 'using' statement and IDisposable to control the asynchronous behavior. Just start a normal thread in the ctor and signal & Join the thread on Dispose().
Example usage:
void Main()
{
    using (new Logging())
    {
        ...
    }
}
Example class (untested):
class Logging : IDisposable
{
    ManualResetEvent _stop = new ManualResetEvent(false);
    Thread _worker = null;

    public Logging()
    {
        _worker = new Thread(AsyncThread);
        _worker.Start();
    }

    public void Dispose()
    {
        _stop.Set();
        _worker.Join();
    }

    public void AsyncThread()
    {
        ...
    }
}
In your logging routine, you will want to test whether the thread is running and then decide between queuing the log write or appending directly to the log output. This way, log messages written before the async thread starts and after it stops are still handled correctly.
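To flesh out the "..." above, a rough sketch of how AsyncThread and a Log method might look (the queue, Flush, and WriteToOutput members here are illustrative additions, not part of the original class; System.Collections.Generic is assumed):

readonly Queue<string> _queue = new Queue<string>();

public void Log(string message)
{
    if (_worker != null && _worker.IsAlive)
    {
        lock (_queue) _queue.Enqueue(message);   // async path: the worker thread will write it
    }
    else
    {
        WriteToOutput(message);                  // before/after the worker's lifetime: write directly
    }
}

public void AsyncThread()
{
    // Drain the queue until Dispose() signals _stop, then flush whatever is left.
    while (!_stop.WaitOne(100))
        Flush();
    Flush();
}

void Flush()
{
    while (true)
    {
        string message;
        lock (_queue)
        {
            if (_queue.Count == 0) return;
            message = _queue.Dequeue();
        }
        WriteToOutput(message);
    }
}

void WriteToOutput(string message)
{
    Console.WriteLine(message);                  // stand-in for database/file output
}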
Consider two classes; Producer and Consumer (the same as classical pattern, each with their own threads). Is it possible for Producer to have an Event which Consumer can register to and when the producer triggers the event, the consumer's event handler is run in its own thread? Here are my assumptions:
Consumer does not know if the Producer's event is triggered within its own thread or another.
Neither Producer nor Consumer descends from Control, so they don't have an inherited BeginInvoke method.
PS. I'm not trying to implement the Producer-Consumer pattern. These are two simple classes; I'm trying to refactor the producer so it incorporates threads.
[UPDATE]
To further expand my problem, I'm trying to wrap a hardware driver so it can be worked with in the simplest way possible. For instance, my wrapper will have a StateChanged event which the main application will register to, so it will be notified when the hardware is disconnected. As the actual driver has no means other than polling to check its presence, I will need to start a thread to check it periodically. Once it is no longer available, I will trigger the event, which needs to be executed on the same thread that subscribed to it. I know this is a classical Producer-Consumer pattern, but since I'm trying to simplify using my driver wrapper, I don't want the user code to implement the consumer.
[UPDATE]
Due to some comments suggesting that there's no solution to this problem, I would like to add a few lines which might change their minds. Considering that BeginInvoke can do what I want, it shouldn't be impossible (at least in theory). Implementing my own BeginInvoke and calling it within the Producer is one way to look at it. It's just that I don't know how BeginInvoke does it!
You want to do inter-thread communication. Yes, it is possible.
Use System.Windows.Threading.Dispatcher
http://msdn.microsoft.com/en-us/library/system.windows.threading.dispatcher.aspx
The Dispatcher maintains a prioritized queue of work items for a specific thread.
When a Dispatcher is created on a thread, it becomes the only Dispatcher that can be associated with the thread, even if the Dispatcher is shut down.
If you attempt to get the CurrentDispatcher for the current thread and a Dispatcher is not associated with the thread, a Dispatcher will be created. A Dispatcher is also created when you create a DispatcherObject. If you create a Dispatcher on a background thread, be sure to shut down the dispatcher before exiting the thread.
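For illustration, a minimal sketch of pumping a Dispatcher on a dedicated worker thread might look like this (this assumes a reference to WindowsBase; the class and variable names are illustrative, not from the answer):

using System;
using System.Threading;
using System.Windows.Threading;

class DispatcherSketch
{
    static void Main()
    {
        Dispatcher workerDispatcher = null;
        var ready = new ManualResetEvent(false);

        var thread = new Thread(() =>
        {
            // Touching CurrentDispatcher creates and associates a Dispatcher with this thread.
            workerDispatcher = Dispatcher.CurrentDispatcher;
            ready.Set();
            Dispatcher.Run(); // process queued work items until shutdown is requested
        });
        thread.Start();
        ready.WaitOne();

        // From any other thread, marshal work onto the worker thread:
        workerDispatcher.Invoke(new Action(() =>
            Console.WriteLine("Running on thread " + Thread.CurrentThread.ManagedThreadId)));

        // Shut the dispatcher down before the thread exits, as the docs advise.
        workerDispatcher.BeginInvokeShutdown(DispatcherPriority.Background);
        thread.Join();
    }
}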
Yes, there is a way to do this. It relies on the SynchronizationContext class (docs). The sync context abstracts the operation of sending messages from one thread to another via the methods Send (synchronous for the calling thread) and Post (asynchronous for the calling thread).
Let's take a slightly simpler situation where you only want to capture one sync context, the context of the "creator" thread. You would do something like this:
using System;
using System.Threading;

class HardwareEvents
{
    private SynchronizationContext context;
    private Timer timer;

    public HardwareEvents()
    {
        context = SynchronizationContext.Current ?? new SynchronizationContext();
        timer = new Timer(TimerMethod, null, 0, 1000); // start immediately, 1 sec interval.
    }

    private void TimerMethod(object state)
    {
        bool hardwareStateChanged = GetHardwareState();
        var handler = StateChanged; // copy to avoid racing with unsubscription / no subscribers
        if (hardwareStateChanged && handler != null)
            context.Post(s => handler(this, EventArgs.Empty), null);
    }

    public event EventHandler StateChanged;

    private bool GetHardwareState()
    {
        // do something to get the state here.
        return true;
    }
}
Now, the creating thread's sync context will be used when events are invoked. If the creating thread was a UI thread it will have a sync context supplied by the framework. If there is no sync context, then the default implementation is used, which invokes on the thread pool. SynchronizationContext is a class that you can subclass if you want to provide a custom way to send a message from the producer to the consumer thread. Just override Post and Send to send said message.
If you wanted every event subscriber to get called back on their own thread, you would have to capture the sync context in the add method. You then hold on to pairs of sync contexts and delegates. Then when raising the event, you would loop through the sync context / delegate pairs and Post each one in turn.
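For illustration only, a per-subscriber capture in a variant of the class above might look roughly like this (the pair list and the RaiseStateChanged helper are assumptions, not part of the answer; System, System.Collections.Generic and System.Threading are assumed):

private readonly List<KeyValuePair<SynchronizationContext, EventHandler>> subscribers =
    new List<KeyValuePair<SynchronizationContext, EventHandler>>();

public event EventHandler StateChanged
{
    add
    {
        // Capture the subscriber's context at subscription time.
        var ctx = SynchronizationContext.Current ?? new SynchronizationContext();
        lock (subscribers)
            subscribers.Add(new KeyValuePair<SynchronizationContext, EventHandler>(ctx, value));
    }
    remove
    {
        lock (subscribers)
            subscribers.RemoveAll(pair => pair.Value == value);
    }
}

private void RaiseStateChanged()
{
    KeyValuePair<SynchronizationContext, EventHandler>[] snapshot;
    lock (subscribers)
        snapshot = subscribers.ToArray();

    foreach (var pair in snapshot)
    {
        var context = pair.Key;
        var handler = pair.Value;
        context.Post(s => handler(this, EventArgs.Empty), null); // each subscriber gets its own context
    }
}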
There are several other ways you could improve this. For example, you may want to suspend polling the hardware if there are no subscribers to the event. Or you might want to back off your polling frequency if the hardware does not respond.
First, please note that in .NET / the Base Class Library, it is usually the event subscriber's obligation to ensure that its callback code is executing on the correct thread. That makes it easy for the event producer: it may just trigger its event without having to care about any thread affinities of its various subscribers.
Here's a complete example step-by-step of a possible implementation.
Let's start with something simple: The Producer class and its event, Event. My example won't include how and when this event gets triggered:
class Producer
{
    public event EventHandler Event; // raised e.g. with `Event(this, EventArgs.Empty);`
}
Next, we want to be able to subscribe our Consumer instances to this event and be called back on a specific thread (I'll call this kind of thread a "worker thread"):
class Consumer
{
    public void SubscribeToEventOf(Producer producer, WorkerThread targetWorkerThread) {…}
}
How do we implement this?
First, we need the means to "send" code to a specific worker thread. Since there is no way to force a thread to execute a particular method whenever you want it to, you must arrange for a worker thread to explicitly wait for work items. One way to do this is via a work item queue. Here's a possible implementation for WorkerThread:
sealed class WorkerThread
{
    public WorkerThread()
    {
        this.workItems = new Queue<Action>();
        this.workItemAvailable = new AutoResetEvent(initialState: false);
        new Thread(ProcessWorkItems) { IsBackground = true }.Start();
    }

    readonly Queue<Action> workItems;
    readonly AutoResetEvent workItemAvailable;

    public void QueueWorkItem(Action workItem)
    {
        lock (workItems) // this is not extensively tested btw.
        {
            workItems.Enqueue(workItem);
        }
        workItemAvailable.Set();
    }

    void ProcessWorkItems()
    {
        for (;;)
        {
            workItemAvailable.WaitOne();

            Action workItem;
            lock (workItems) // ditto, not extensively tested.
            {
                workItem = workItems.Dequeue();
                if (workItems.Count > 0) workItemAvailable.Set();
            }

            workItem.Invoke();
        }
    }
}
This class basically starts a thread and puts it in an infinite loop that falls asleep (WaitOne) until an item arrives in its queue (workItems). Once that happens, the item — an Action — is dequeued and invoked. Then the thread goes to sleep again (WaitOne) until another item is available in the queue.
Actions are put in the queue via the QueueWorkItem method. So essentially we can now send code to be executed to a specific WorkerThread instance by calling that method. We're now ready to implement Consumer.SubscribeToEventOf:
class Consumer
{
    public void SubscribeToEventOf(Producer producer, WorkerThread targetWorkerThread)
    {
        producer.Event += delegate(object sender, EventArgs e)
        {
            targetWorkerThread.QueueWorkItem(() => OnEvent(sender, e));
        };
    }

    protected virtual void OnEvent(object sender, EventArgs e)
    {
        // this code is executed on the worker thread(s) passed to `Subscribe…`.
    }
}
Voilà!
P.S. (not discussed in detail): As an add-on, you could package the method of sending code to WorkerThread using a standard .NET mechanism called a SynchronizationContext:
sealed class WorkerThreadSynchronizationContext : SynchronizationContext
{
    public WorkerThreadSynchronizationContext(WorkerThread workerThread)
    {
        this.workerThread = workerThread;
    }

    private readonly WorkerThread workerThread;

    public override void Post(SendOrPostCallback d, object state)
    {
        workerThread.QueueWorkItem(() => d(state));
    }

    // other overrides for `Send` etc. omitted
}
And at the beginning of WorkerThread.ProcessWorkItems, you'd set the synchronization context for that particular thread as follows:
SynchronizationContext.SetSynchronizationContext(
    new WorkerThreadSynchronizationContext(this));
I posted earlier that I've been there, and that there is no nice solution.
However, I just stumbled upon something I have done in another context before: you could instantiate a timer (that is, Windows.Forms.Timer) when you create your wrapper object. This timer will post all Tick events to the ui thread.
Now if your device-polling logic is non-blocking and fast, you could implement it directly inside the timer's Tick event and raise your custom event there.
Otherwise, you could continue to do the polling logic inside a thread, and instead of firing the event inside the thread, you just flip some boolean variable which gets read by the timer every 10 ms, which then fires the event.
Note that this solution still requires that the object is created from the GUI thread, but at least the user of the object will not have to worry about Invoke.
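A rough sketch of that shape (the class and member names here are illustrative, not from the answer; a Windows Forms reference is assumed):

using System;
using System.Threading;

class DeviceWrapper
{
    private readonly System.Windows.Forms.Timer uiTimer = new System.Windows.Forms.Timer();
    private volatile bool stateChangePending;

    public event EventHandler StateChanged;

    public DeviceWrapper() // must be constructed on the GUI thread
    {
        uiTimer.Interval = 10; // ms
        uiTimer.Tick += delegate
        {
            if (stateChangePending)
            {
                stateChangePending = false;
                var handler = StateChanged;
                if (handler != null)
                    handler(this, EventArgs.Empty); // raised on the UI thread
            }
        };
        uiTimer.Start();

        new Thread(PollDevice) { IsBackground = true }.Start();
    }

    private void PollDevice()
    {
        while (true)
        {
            Thread.Sleep(1000);        // stand-in for slow/blocking driver polling
            stateChangePending = true; // just flip the flag; the timer raises the event
        }
    }
}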
It is possible. One typical approach is to use the BlockingCollection class. This data structure works like a normal queue except that the dequeue operation blocks the calling thread if the queue is empty. The producer will queue items by calling Add and the consumer will dequeue them by calling Take. The consumer typically runs its own dedicated thread spinning an infinite loop waiting for items to appear in the queue. This is, more or less, how the message loop on the UI thread operates and is the basis for getting the Invoke and BeginInvoke operations to accomplish the marshaling behavior.
using System;
using System.Collections.Concurrent;
using System.Threading;

public class Consumer
{
    private BlockingCollection<Action> queue = new BlockingCollection<Action>();

    public Consumer()
    {
        var thread = new Thread(
            () =>
            {
                while (true)
                {
                    Action method = queue.Take(); // blocks until an item is available
                    method();
                }
            });
        thread.Start();
    }

    public void BeginInvoke(Action method)
    {
        queue.Add(method);
    }
}
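Usage might then look something like this (a trivial sketch):

var consumer = new Consumer();
// Each call is marshaled onto the consumer's dedicated thread and executed in order.
consumer.BeginInvoke(() => Console.WriteLine("Handled on the consumer thread"));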
I am writing a GUI app for monitoring and I'd like to get some advice on its logic. Basically all the app needs to do is connect to a distant server every x minutes, check if something was changed, get the changes if any, and act upon them (update the local db and so on, depending on the changes).
My first idea was:
Have a checkbox (monitoring on/off). On click (if checked) starts a Timer.
The Timer launches a BackgroundWorker in its Tick method.
DoWork method does the connecting / retrieving info stuff
a) in the WorkDone handler, the method gets the info from the background worker and does local updates with it
b) in the WorkDone handler, the method triggers one or more custom "SomethingChanged" events depending on the changes it got; event listeners handle local updates from there.
My main problem is calling the Worker from the Timer, since I added the Worker to the Form and now they are on different threads (is that the correct description?), and then passing results around is a similar problem. I was reading about delegates but am still not sure what to use when and how, and whether it's really necessary in the first place. Do I need both the bgWorker and the Timer? Do I need custom events, or can I just do all the work inside WorkDone with a switch(result)? Is this general approach good in the first place, or is there something better and I am reinventing the wheel? Thank you in advance!
From an architecture point of view:
Message queues decouple bits of your application. In Windows Forms applications you can rely on the message queue that Windows creates and manages for you. Google PostMessage/GetMessage etc. This is generally called "message passing".
Typical Architecture
One part of your app "pushes" a request into a queue
Some other part of your app "pulls" a request from a queue and writes a result to a second queue.
The first part can then "pull" results from the second "results" queue and display them to the user.
So it looks like this:
App --> REQUESTS QUEUE --> processing engine --> RESULTS QUEUE --> App
The processing engine could be in the same app, on the same thread or in a different thread/process (or even different machine).
You can use simple queues, say a Queue<string> (as long as you use locks to access it), or move up to more and more complex/functional queues.
Issues with the naive strategy and other solutions ... things to think about:
What happens if you make a new request while the old one has not yet completed?
What happens if an error occurs? Do you want errors inline, or could you use another queue for errors?
Do you want retries?
What happens if a message is lost? (i.e. a request was pushed, but no response comes in...)? There are transactional queues etc.
Sample Code
using System;
using System.Collections.Generic;
using System.Net;
using System.Threading;
using System.Windows.Forms;

public partial class Form1 : Form
{
    object oLock = new object();
    Queue<string> requests = new Queue<string>();
    Queue<string> responses = new Queue<string>();
    Thread mThread;
    AutoResetEvent mEvent = new AutoResetEvent(false);

    public Form1()
    {
        InitializeComponent();
        mThread = new Thread(ProcessingEngine);
        mThread.IsBackground = true;
        mThread.Start();
    }

    private void ProcessingEngine()
    {
        string result;
        string request = null;
        while (true)
        {
            try
            {
                mEvent.WaitOne();
                lock (oLock)
                {
                    request = requests.Dequeue();
                }
                var wc = new WebClient();
                result = wc.DownloadString(request);
                lock (oLock)
                {
                    responses.Enqueue(result);
                }
            }
            catch (Exception ex)
            {
                lock (oLock)
                {
                    responses.Enqueue(ex.ToString());
                }
            }
        }
    }

    private void timer1_Tick(object sender, EventArgs e)
    {
        lock (oLock)
        {
            // Stick in a new request
            requests.Enqueue("http://yahoo.com");
            // Allow the thread to start work
            mEvent.Set();
            // Check if a response has arrived
            if (responses.Count > 0)
            {
                var result = responses.Dequeue();
                listBox1.Items.Add(result.Substring(0, Math.Min(200, result.Length))); // guard against short responses
            }
        }
    }
}
If you use System.Windows.Forms.Timer instead of System.Threading.Timer, your Tick handler will be called from the Form's message loop and you'll have full access to all controls - it will be safe to call bgWorker.RunWorkerAsync(). As for retrieving results - RunWorkerCompleted is also called from the message loop thread, and you can safely update your UI there.
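For illustration, the shape of that approach might look roughly like this (the handler wiring and the CheckServerForChanges/ApplyChanges helpers are assumptions, not from the question):

private void timer1_Tick(object sender, EventArgs e)
{
    if (!bgWorker.IsBusy)               // skip a tick if the previous check is still running
        bgWorker.RunWorkerAsync();
}

private void bgWorker_DoWork(object sender, DoWorkEventArgs e)
{
    e.Result = CheckServerForChanges(); // runs on a thread-pool thread; no UI access here
}

private void bgWorker_RunWorkerCompleted(object sender, RunWorkerCompletedEventArgs e)
{
    // Back on the UI thread: safe to touch controls or raise "SomethingChanged" events.
    if (e.Error == null)
        ApplyChanges(e.Result);
}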
The solution is simple - INVOKE back into the main thread. There is an Invoke method on the WinForms control. This will basically marshal execution to the UI thread and allow you to manipulate the UI.
Do that in a "block" (i.e. not once per control, but once when you have news).
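For example, from the background thread you might do something like this (assuming the code runs inside a Form and statusLabel is one of its controls):

// Marshal the UI update onto the UI thread; Invoke blocks until it has run.
this.Invoke(new Action(() => statusLabel.Text = "New data arrived"));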
I'm trying to make cross-threaded calls in C#.
Whenever I invoke the methods of an object created in the context of thread A from a static method called from thread B, the method always runs in thread B. I don't want that; I want it to run on the same thread as the thread A object whose methods I am calling.
Invoke works fine for UI calls, and I've read dozens of articles and SO answers relating to different ways of making cross-threaded Forms/WPF calls. However, whatever I try (event handling, delegates, etc.), Thread A's object's method will always run in Thread B if it is invoked by Thread B.
What part of the library should I be looking in to solve this? If it's relevant, Thread B currently 'spins', reads from a network port and occasionally invokes Thread A's object's method through a delegate that was created in Thread A and passed in using a ParameterizedThreadStart.
I'm not looking to change paradigm, just send a message (a request to invoke a method) from one thread (Thread B) to another (Thread A).
EDIT:
My question was 'what part of the library should I be looking in to solve this?' The answer appears to be none. If I want to clearly delineate consumption and polling I'll have to write my own code to handle that.
Whenever I invoke the methods of an object running on thread A
Objects don't run on threads.
In order for this to work, you will have to create some kind of queue you can shove a delegate into, which will be routinely checked by thread A's main loop. Something like this, assuming that Something.MainThreadLoop is the entry point for thread A:
public class Something
{
    private Queue<Action> actionQueue = new Queue<Action>();
    private volatile bool threadRunning = true;

    public void RunOnThread(Action action)
    {
        if (action == null)
            throw new ArgumentNullException("action");

        lock (actionQueue)
            actionQueue.Enqueue(action);
    }

    public void Stop()
    {
        threadRunning = false;
    }

    private void RunPendingActions()
    {
        // Check the count and dequeue under the lock so they stay consistent.
        while (true)
        {
            Action action;
            lock (actionQueue)
            {
                if (actionQueue.Count == 0)
                    break;
                action = actionQueue.Dequeue();
            }
            action();
        }
    }

    public void MainThreadLoop()
    {
        while (threadRunning)
        {
            // Do the stuff you were already doing on this thread.
            // Then, periodically...
            RunPendingActions();
        }
    }
}
Then, given a reference to a Something object, you could do this:
something.RunOnThread(() => Console.WriteLine("I was printed from thread A!"));
Code runs on threads. Objects aren't (generally - see thread local) bound to a particular thread. By doing WinFormControl.Invoke or WPFControl.Invoke, you are posting a message to the Message Pump or Dispatcher respectively, to run some code at a later date.
The message pump is something like this:
Message message;
while (GetMessage(&message))
{
    ProcessMessage(message);
}
Microsoft has specifically built their UI controls and projects to allow the posting of messages across threads. Calling a method from thread A will always execute that method on thread A, even if it ends up doing some kind of asynchronous work and returning early.
Edit:
What I think you need is the Producer-Consumer pattern.
http://msdn.microsoft.com/en-us/library/yy12yx1f(VS.80).aspx
Forget about consuming the messages from your main thread, which is what it sounds like you want to do. Consume from thread C.
Thread A is doing 'much more important things'. Thread B is spinning, listening for messages. Thread C is consuming those messages.
No need for marshalling across threads.
EDIT: I think you probably want to use the System.Threading.AutoResetEvent class. The MSDN documentation has a decent example of one thread waiting on the other that I think is similar to what you are trying to do: http://msdn.microsoft.com/en-us/library/system.threading.autoresetevent.aspx
In particular, pay attention to the calls to trigger.WaitOne() and trigger.Set()
EDIT2: Added option #3 after reading new comment from OP.
"Whenever I invoke the methods of an object running on thread A ..." - An object doesn't "run" on a thread and isn't really owned by any thread, regardless of what thread created the object.
Given that your question is regarding "non-UI cross-thread invocation", I assume you are already familiar with "UI cross-thread invocation". I can see how WinForms would give you an impression that a thread owns an object and that you need to "send a message" to a thread in order to make it do something.
WinForm control objects are kind of a special case in that they simply don't function properly if you interact with them from a thread other than the one that created them, but that's not caused by the way threads and objects interact in general.
Anyway, on to addressing your question.
First, a question to clarify the problem: You've mentioned what Thread B is doing, but what is Thread A doing prior to being "invoked" by Thread B?
Here are a couple of ideas that I think are along the lines of what you want to do:
1. Don't create Thread A until you need to. Instead of having Thread B "send a message to Thread A", have Thread B create Thread A (or call it Thread C if you prefer) and start it executing at that time.
2. If you need Thread A to already exist and you only want it to handle Thread B's events one at a time, you could have Thread A wait until it receives a notification from Thread B. Take a look at the System.Threading.WaitHandle class (derived classes of interest are ManualResetEvent and AutoResetEvent). Thread A will at some point call WaitHandle.WaitOne(), which will cause it to pause and wait until Thread B calls WaitHandle.Set() on the same WaitHandle object.
3. If Thread A is busy doing other things, then you might want to set up some kind of flag variable. Similar to the WaitHandle concept in #2, but instead of causing Thread A to pause, you just want Thread B to set a flag (perhaps just a boolean variable) that signals to Thread A that it needs to do something. While Thread A is busy doing other things, it can periodically check that flag to decide whether or not there is work to be done.
Does the method that Thread A will execute on your object require any input from Thread B? Then before Thread B calls WaitHandle.Set(), have it stick some data into a queue. When Thread A is "activated", it can retrieve that data from the queue and proceed to execute the object's method using that data. Use a lock mechanism (i.e. the C# lock statement) to synchronize access to the queue; a sketch combining #2 with such a queue follows below.
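A minimal sketch of option #2 combined with the data queue from the last paragraph (all names here are illustrative; System.Threading and System.Collections.Generic are assumed):

static readonly AutoResetEvent workReady = new AutoResetEvent(false);
static readonly Queue<byte[]> pending = new Queue<byte[]>();

// Thread A's loop: pause until signaled, then drain everything that was queued.
static void ThreadAProc()
{
    while (true)
    {
        workReady.WaitOne();                 // WaitHandle.WaitOne(): pause until Thread B signals
        while (true)
        {
            byte[] data;
            lock (pending)
            {
                if (pending.Count == 0) break;
                data = pending.Dequeue();
            }
            HandleData(data);                // the object's method, now running on Thread A
        }
    }
}

// Called by Thread B when it has input for Thread A.
static void QueueFromThreadB(byte[] data)
{
    lock (pending)
        pending.Enqueue(data);
    workReady.Set();                         // WaitHandle.Set(): wake Thread A
}

static void HandleData(byte[] data) { /* placeholder for the real work */ }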
What you're going to have to do is roll a sort of Queue and have Thread A watch that queue for work. When Thread A sees new work enter the queue, it can dequeue it and do the work, then return to waiting for more.
Here's some pseudo-code:
public class ThreadAQueue
{
    private Queue<Action> _queue = new Queue<Action>();
    private volatile bool _quitWorking;

    public void EnqueueSomeWork(Action work)
    {
        lock (_queue)
        {
            _queue.Enqueue(work);
        }
    }

    private void DoTheWork()
    {
        while (!_quitWorking)
        {
            Action myWork = null;
            lock (_queue)
            {
                if (_queue.Count > 0)
                    myWork = _queue.Dequeue();
            }
            if (myWork != null)
                myWork();
        }
    }
}
I have a class that implements the Begin/End Invocation pattern where I initially used ThreadPool.QueueUserWorkItem() to thread my work. The work done on the thread doesn't loop, but it does take a bit of time to process, so the work itself is not easily stopped.
I now have the side effect where someone using my class is calling the Begin (with callback) a ton of times to do a lot of processing, so ThreadPool.QueueUserWorkItem is creating a ton of threads to do the processing. That in itself isn't bad, but there are instances where they want to abandon the processing and start a new process, yet they are forced to wait for their first request to finish.
Since ThreadPool.QueueUserWorkItem() doesn't allow me to cancel the threads, I am trying to come up with a better way to queue up the work, and maybe use an explicit FlushQueue() method in my class to allow the caller to abandon work in my queue.
Anyone have any suggestion on a threading pattern that fits my needs?
Edit: I'm currently targeting the 2.0 framework. I'm currently thinking that a Consumer/Producer queue might work. Does anyone have thoughts on the idea of flushing the queue?
Edit 2 Problem Clarification:
Since I'm using the Begin/End pattern in my class every time the caller uses the Begin with callback I create a whole new thread on the thread pool. This call does a very small amount of processing and is not where I want to cancel. It's the uncompleted jobs in the queue I wish to stop.
The fact that the ThreadPool will create up to 250 threads per processor by default means that if you ask the ThreadPool to queue a large number of items with QueueUserWorkItem(), you end up creating a huge number of concurrent threads that you have no way of stopping.
The caller is able to push the CPU to 100% with not only the work but the creation of the work because of the way I queued the threads.
I was thinking that by using the Producer/Consumer pattern I could queue this work into my own queue, which would allow me to moderate how many threads I create and avoid the CPU spike of creating all the concurrent threads. And I might be able to allow the caller of my class to flush all the jobs in the queue when they are abandoning the requests.
I am currently trying to implement this myself, but figured SO was a good place to have someone say "look at this code", or "you won't be able to flush because of this", or "flushing isn't the right term, you mean this".
EDIT My answer does not apply since the OP is using 2.0. Leaving it up and switching to CW for anyone who reads this question and is using 4.0.
If you are using C# 4.0, or can take a dependency on one of the earlier versions of the parallel frameworks, you can use their built-in cancellation support. It's not as easy as cancelling a thread, but the framework is much more reliable (cancelling a thread is very attractive but also very dangerous).
Reed did an excellent article on this that you should take a look at:
http://reedcopsey.com/2010/02/17/parallelism-in-net-part-10-cancellation-in-plinq-and-the-parallel-class/
A method I've used in the past, though it's certainly not a best practice, is to dedicate a class instance to each thread and have an abort flag on the class. Then create a ThrowIfAborting method on the class that is called periodically from the thread (particularly if the thread's running a loop, just call it every iteration). If the flag has been set, ThrowIfAborting will simply throw an exception, which is caught in the main method for the thread. Just make sure to clean up your resources as you're aborting.
You could extend the Begin/End pattern to become the Begin/Cancel/End pattern. The Cancel method could set a cancel flag that the worker thread polls periodically. When the worker thread detects a cancel request, it can stop its work, clean-up resources as needed, and report that the operation was canceled as part of the End arguments.
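A minimal sketch of that shape, assuming a plain polled flag (the class and member names are illustrative, not the poster's actual API):

using System;
using System.Threading;

class CancellableOperation
{
    private volatile bool cancelRequested;
    private bool wasCanceled;
    private Thread worker;

    public bool WasCanceled { get { return wasCanceled; } }

    public void Begin()
    {
        cancelRequested = false;
        worker = new Thread(() =>
        {
            for (int step = 0; step < 100; step++)   // stand-in for the real chunks of work
            {
                if (cancelRequested)                 // poll the flag between chunks
                {
                    wasCanceled = true;              // report cancellation as part of "End"
                    return;
                }
                Thread.Sleep(50);
            }
        });
        worker.Start();
    }

    public void Cancel()
    {
        cancelRequested = true;
    }

    public void End()
    {
        worker.Join(); // caller inspects WasCanceled afterwards
    }
}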
I've solved what I believe to be your exact problem by using a wrapper class around 1+ BackgroundWorker instances.
Unfortunately, I'm not able to post my entire class, but here's the basic concept along with its limitations.
Usage:
You simply create an instance and call RunOrReplace(...) when you want to cancel your old worker and start a new one. If the old worker was busy, it is asked to cancel and then another worker is used to immediately execute your request.
public class BackgroundWorkerReplaceable : IDisposable
{
    BackgroundWorker activeWorker = null;
    object activeWorkerSyncRoot = new object();
    List<BackgroundWorker> workerPool = new List<BackgroundWorker>();
    DoWorkEventHandler doWork;
    RunWorkerCompletedEventHandler runWorkerCompleted;

    public bool IsBusy
    {
        get { return activeWorker != null && activeWorker.IsBusy; }
    }

    public BackgroundWorkerReplaceable(DoWorkEventHandler doWork, RunWorkerCompletedEventHandler runWorkerCompleted)
    {
        this.doWork = doWork;
        this.runWorkerCompleted = runWorkerCompleted;
        ResetActiveWorker();
    }
    public void RunOrReplace(Object param, ...) // Overloads could include ProgressChangedEventHandler and other stuff
    {
        try
        {
            lock (activeWorkerSyncRoot)
            {
                if (activeWorker.IsBusy)
                {
                    ResetActiveWorker();
                }

                // This works because if IsBusy was false above, there is no way for it to become true without another thread obtaining a lock
                if (!activeWorker.IsBusy)
                {
                    // Optionally handle ProgressChangedEventHandler and other features (under the lock!)

                    // Work on this new param
                    activeWorker.RunWorkerAsync(param);
                }
                else
                { // This should never happen since we create new workers when there's none available!
                    throw new LogicException(...); // assert or similar
                }
            }
        }
        catch (...) // InvalidOperationException and Exception
        { // In my experience, it's safe to just show the user an error and ignore these, but that's going to depend on what you use this for and where you want the exception handling to be
        }
    }
    public void Cancel()
    {
        ResetActiveWorker();
    }

    public void Dispose()
    { // You should implement a proper Dispose/Finalizer pattern
        if (activeWorker != null)
        {
            activeWorker.CancelAsync();
        }

        foreach (BackgroundWorker worker in workerPool)
        {
            worker.CancelAsync();
            worker.Dispose();
            // perhaps use a for loop instead so you can set worker to null? This might help the GC, but it's probably not needed
        }
    }

    void ResetActiveWorker()
    {
        lock (activeWorkerSyncRoot)
        {
            if (activeWorker == null)
            {
                activeWorker = GetAvailableWorker();
            }
            else if (activeWorker.IsBusy)
            { // Current worker is busy - issue a cancel and set another active worker
                activeWorker.CancelAsync(); // Make sure WorkerSupportsCancellation is set to true [Link9372]
                // Optionally handle ProgressEventHandler -=
                activeWorker = GetAvailableWorker(); // Ensure that the activeWorker is available
            }
            //else - do nothing, activeWorker is already ready for work!
        }
    }
    BackgroundWorker GetAvailableWorker()
    {
        // Loop through workerPool and return a worker if IsBusy is false
        // if the loop exits without returning...
        if (activeWorker != null)
        {
            workerPool.Add(activeWorker); // Save the old worker for possible future use
        }
        return GenerateNewWorker();
    }

    BackgroundWorker GenerateNewWorker()
    {
        BackgroundWorker worker = new BackgroundWorker();
        worker.WorkerSupportsCancellation = true; // [Link9372]
        //worker.WorkerReportsProgress
        worker.DoWork += doWork;
        worker.RunWorkerCompleted += runWorkerCompleted;
        // Other stuff
        return worker;
    }
} // class
Pro/Con:
This has the benefit of having a very low delay in starting your new execution, since new threads don't have to wait for old ones to finish.
This comes at the cost of a theoretical never-ending growth of BackgroundWorker objects that never get GC'd. However, in practice the code above attempts to recycle old workers, so you shouldn't normally encounter a large pool of idle threads. If you are worried about this because of how you plan to use this class, you could implement a Timer which fires a CleanUpExcessWorkers(...) method, or have ResetActiveWorker() do this cleanup (at the cost of a longer RunOrReplace(...) delay).
The main cost of using this is precisely why it's beneficial - it doesn't wait for the previous thread to exit. So, for example, if DoWork is performing a database call and you execute RunOrReplace(...) 10 times in rapid succession, the database call might not be immediately canceled when the thread is - so you'll have 10 queries running, making all of them slow! This generally tends to work fine with Oracle, causing only minor delays, but I do not have experience with other databases (to speed up the cleanup, I have the canceled worker tell Oracle to cancel the command). Proper use of the EventArgs described below mostly solves this.
Another minor cost is that whatever code this BackgroundWorker is performing must be compatible with this concept - it must be able to safely recover from being canceled. The DoWorkEventArgs and RunWorkerCompletedEventArgs have Cancel/Cancelled properties which you should use. For example, if you do database calls in the DoWork method (mainly what I use this class for), you need to make sure you periodically check these properties and perform the appropriate clean-up.
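For example, a DoWork handler passed into the constructor above might cooperate with CancelAsync() roughly like this (a sketch; ExecuteStep is a stand-in for one unit of the real work):

private void ExampleDoWork(object sender, DoWorkEventArgs e)
{
    var worker = (BackgroundWorker)sender;
    for (int step = 0; step < 10; step++)
    {
        if (worker.CancellationPending)  // set by CancelAsync()
        {
            e.Cancel = true;             // RunWorkerCompletedEventArgs.Cancelled will then be true
            // clean up here (e.g. tell the database to cancel the in-flight command)
            return;
        }
        ExecuteStep(step);               // stand-in for one unit of the real work
    }
}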