Is it possible not to block the WinForm using WaitHandle.WaitAll(waitHandles), but instead set up another thread which will fire when it gets the completion signal from WaitHandle.WaitAll?
I would not use WaitHandle.WaitAll. There are a couple of problems with this approach.
There is a 64 handle limit.
It cannot be used on an STA thread.
It promotes patterns that depend on the creation of multiple WaitHandle instances, which obviously consume resources.
Instead, I typically use the CountdownEvent class when I want to wait on multiple events. Now, the problem you will have with that is that it still requires you to call Wait on some thread which is exactly what you are trying to avoid. The standard mechanism to avoid making a blocking call is to use the ThreadPool.RegisterWaitForSingleObject method. But, unfortunately that takes a WaitHandle and CountdownEvent does not inherit from that class.
The solution is to create your own CountdownWaitHandle class that can be used in the ThreadPool.RegisterWaitForSingleObject method. This approach will allow you to specify a callback delegate that will be executed once the WaitHandle is signaled.
Here is the most basic implementation of the CountdownWaitHandle class. You will have to add all of the necessary hardening code yourself, but this will get you started.
public class CountdownWaitHandle : WaitHandle
{
    private int m_Count = 0;
    private ManualResetEvent m_Event = new ManualResetEvent(false);

    public CountdownWaitHandle(int initialCount)
    {
        m_Count = initialCount;
        // Expose the inner event's native handle so this instance can be handed to
        // APIs such as ThreadPool.RegisterWaitForSingleObject.
        this.SafeWaitHandle = m_Event.SafeWaitHandle;
    }

    public void AddCount()
    {
        Interlocked.Increment(ref m_Count);
    }

    // Decrement the count; when it reaches zero, put the handle into the signaled state.
    public void Signal()
    {
        if (Interlocked.Decrement(ref m_Count) == 0)
        {
            m_Event.Set();
        }
    }

    public override bool WaitOne()
    {
        return m_Event.WaitOne();
    }
}
The idea here is that instead of using many different WaitHandle instances you use a single CountdownWaitHandle instance. Initialize the instance with the desired count and then call Signal to decrement the count. Once the count reaches zero the WaitHandle goes into the signaled state. So instead of calling Set on multiple WaitHandle instances and blocking with WaitHandle.WaitAll, you now call Signal on this one instance and block by calling WaitOne. And again, you can push the blocking call off to the thread pool by using ThreadPool.RegisterWaitForSingleObject, which will invoke a callback when the WaitHandle is signaled.
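For illustration, here is a hedged sketch of that registration. The count of three work items and the console message are my own assumptions, not part of the original answer:
var countdown = new CountdownWaitHandle(3);

// No thread blocks here; the thread pool invokes the callback once the handle is signaled.
ThreadPool.RegisterWaitForSingleObject(
    countdown,
    (state, timedOut) => Console.WriteLine("All work items have signaled."),
    null,              // state passed to the callback
    Timeout.Infinite,  // never time out
    true);             // run the callback only once

for (int i = 0; i < 3; i++)
{
    ThreadPool.QueueUserWorkItem(_ =>
    {
        // ... do some work ...
        countdown.Signal();   // the third call sets the inner event
    });
}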
You can call WaitAll in a background thread, then call BeginInvoke to move back to the UI thread.
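A hedged sketch of that approach from inside a Form; waitHandles and OnAllDone are placeholders for your own handle array and UI update:
var worker = new Thread(() =>
{
    WaitHandle.WaitAll(waitHandles);          // blocks this background thread, not the form
    this.BeginInvoke(new Action(OnAllDone));  // marshal the completion back to the UI thread
});
worker.IsBackground = true;
worker.Start();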
Have a look at ManualResetEvent. Using this you can set the event when your thread has finished, and any other thread can either wait on this event, or check to see if it is in the signalled state.
ManualResetEvent ev = new ManualResetEvent(false);
while (Users["user428547"].AcceptanceRate == 0)
{
    // this might take a long time
}
ev.Set(); // done, he accepted an answer.
Perhaps you could start another thread yourself, and call WaitHandle.WaitAll yourself on that thread? If you are not starting too many other threads, this should work fine.
I have a class ClassA
public class ClassA
{
    public ClassA()
    {
        Thread t = new Thread(EndlessLoop);
        t.IsBackground = true;
        t.Start();
    }

    private void EndlessLoop()
    {
        while (true)
        {
            // do something
        }
    }
}
and I'm not sure whether the thread will be disposed if I set the ClassA object to null:
ClassA a = new ClassA();
// will the thread exit?
a = null;
Or maybe I should implement IDisposable, and call it manually?
Once started, the thread will terminate after the routine comes to an end (or Thread.Abort is invoked or the program exits). The Thread class doesn't implement IDisposable so there's no Dispose method to call. To terminate a long-running thread, you could set a flag that the thread checks periodically.
The Thread object is eligible to be garbage collected once it goes out of scope and is no longer referenced. However, the spawned thread will continue running.
Nothing is going to happen to the OS thread if you remove the last reference to the Thread object corresponding to it (see C# Thread object lifetime). The thread will continue to run the code until the method finishes (unlikely in the while(true) case shown), the thread is terminated with Abort (don't do that; see What's wrong with using Thread.Abort()), or the process ends.
The only good option is to somehow notify the thread's code that it should finish (e.g. using events or even a global variable protected by a lock). Also consider whether using Task and async with the corresponding cancellation mechanism would simplify the code (it would not solve the infinite-loop issue, but it gives a good framework for writing cancellable operations).
Note that you can't "dispose" of a thread because Thread does not implement IDisposable (see Do we need to dispose or terminate a thread in C# after usage?).
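If you do want to stop the loop deterministically, here is a minimal sketch using a CancellationTokenSource; making the wrapper class implement IDisposable is my own choice for illustration, not something Thread requires:
using System;
using System.Threading;

public class ClassA : IDisposable
{
    private readonly CancellationTokenSource cts = new CancellationTokenSource();

    public ClassA()
    {
        var t = new Thread(() => EndlessLoop(cts.Token)) { IsBackground = true };
        t.Start();
    }

    private void EndlessLoop(CancellationToken token)
    {
        while (!token.IsCancellationRequested)
        {
            // do something, checking the token between units of work
        }
    }

    public void Dispose()
    {
        cts.Cancel();   // ask the loop to exit cooperatively
        cts.Dispose();
    }
}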
I have code which synchronizes threads via AutoResetEvent.
Basically there are two threads which take turns executing commands, one thread at a time.
Code :
static EventWaitHandle _waitHandle = new AutoResetEvent(false);

static void Waiter()
{
    _waitHandle.WaitOne();
    Console.WriteLine("A...");
    _waitHandle.Set();
    _waitHandle.WaitOne();
    Console.WriteLine("A2...");
    _waitHandle.Set();
}

static void Waiter2()
{
    _waitHandle.WaitOne();
    Console.WriteLine("B...");
    _waitHandle.Set();
    _waitHandle.WaitOne();
    Console.WriteLine("B2...");
}

void Main()
{
    new Thread(Waiter).Start();
    new Thread(Waiter2).Start();
    _waitHandle.Set(); // Wake up the Waiter.
}
Result : (I always get this result)
A...
B...
A2...
B2...
However, when I move to Tasks:
Task.Run(()=>Waiter());
Task.Run(()=>Waiter2());
I sometimes get :
B...
A...
B2...
Which is clear to me because the task scheduler scheduled the second task to execute first.
Which leads me to ask :
Questions
1) Is the thread execution order guaranteed to be the same as the order of invocation in:
new Thread(Waiter).Start();
new Thread(Waiter2).Start();
// In other words, will I always get the first result?
2) How can I force the Task.Run calls to be invoked in the same order as I invoke them?
No, it is not guaranteed, you just got lucky that the output was the same every time.
Add a second AutoResetEvent that has a WaitOne between the two tasks and a Set at the start of the Waiter method.
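For illustration, here is one way to make the ordering fully deterministic; rather than a single shared handle plus a "started" event, this sketch gives each thread its own "turn" event, which is a variation on the suggestion above:
using System;
using System.Threading;
using System.Threading.Tasks;

static class Alternating
{
    // Each thread gets its own "turn" event, so the order no longer depends on scheduling.
    static readonly AutoResetEvent _aTurn = new AutoResetEvent(false);
    static readonly AutoResetEvent _bTurn = new AutoResetEvent(false);

    static void Waiter()
    {
        _aTurn.WaitOne();
        Console.WriteLine("A...");
        _bTurn.Set();
        _aTurn.WaitOne();
        Console.WriteLine("A2...");
        _bTurn.Set();
    }

    static void Waiter2()
    {
        _bTurn.WaitOne();
        Console.WriteLine("B...");
        _aTurn.Set();
        _bTurn.WaitOne();
        Console.WriteLine("B2...");
    }

    static void Main()
    {
        Task.Run(() => Waiter());
        Task.Run(() => Waiter2());
        _aTurn.Set();           // A always runs first, regardless of which task starts first
        Console.ReadLine();     // keep the process alive while the tasks finish
    }
}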
Without a synchronization mechanism, you cannot guarantee the order in which a thread will start and/or execute. Furthermore, a thread's execution may be preempted (think: "paused") at any time.
So to answer your questions:
No
No
Before moving forward, you should ask yourself "Do I really need to use threads to solve this problem?"
My favorite quote from Microsoft's MSDN:
"When you use multithreading of any sort, you potentially expose yourself to very serious and complex bugs" [Best Practices for Implementing the Event-based Asynchronous Pattern]
If you do need to introduce threads, then I would begin by familiarizing yourself with some of Microsoft's synchronization mechanisms:
Critical Section
Mutex
Events
Auto Reset
Manual Reset
Consider two classes; Producer and Consumer (the same as classical pattern, each with their own threads). Is it possible for Producer to have an Event which Consumer can register to and when the producer triggers the event, the consumer's event handler is run in its own thread? Here are my assumptions:
Consumer does not know whether the Producer's event is triggered within its own thread or another.
Neither Producer nor Consumer is a descendant of Control, so they don't inherit a BeginInvoke method.
PS. I'm not trying to implement the Producer-Consumer pattern. These are two simple classes, and I'm trying to refactor the producer so it incorporates threads.
[UPDATE]
To further expand my problem, I'm trying to wrap a hardware driver so it can be worked with in the simplest way possible. For instance, my wrapper will have a StateChanged event which the main application will register to so it will be notified when the hardware is disconnected. As the actual driver has no means other than polling to check its presence, I will need to start a thread to check it periodically. Once it is not available anymore I will trigger the event, which needs to be executed in the same thread in which it was added. I know this is a classical Producer-Consumer pattern, but since I'm trying to simplify using my driver-wrapper, I don't want the user code to implement the consumer.
[UPDATE]
Due to some comments suggesting that there's no solution to this problem, I would like to add a few lines which might change their minds. Considering that BeginInvoke can do what I want, it shouldn't be impossible (at least in theory). Implementing my own BeginInvoke and calling it within the Producer is one way to look at it. It's just that I don't know how BeginInvoke does it!
You want to do inter-thread communication. Yes, it is possible.
Use System.Windows.Threading.Dispatcher
http://msdn.microsoft.com/en-us/library/system.windows.threading.dispatcher.aspx
The Dispatcher maintains a prioritized queue of work items for a specific thread.
When a Dispatcher is created on a thread, it becomes the only Dispatcher that can be associated with the thread, even if the Dispatcher is shut down.
If you attempt to get the CurrentDispatcher for the current thread and a Dispatcher is not associated with the thread, a Dispatcher will be created. A Dispatcher is also created when you create a DispatcherObject. If you create a Dispatcher on a background thread, be sure to shut down the dispatcher before exiting the thread.
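For illustration only, a sketch of a dedicated dispatcher thread; this needs a reference to WindowsBase (WPF), and the helper name StartDispatcherThread is mine:
using System;
using System.Threading;
using System.Windows.Threading;

static class DispatcherExample
{
    static Dispatcher StartDispatcherThread()
    {
        Dispatcher dispatcher = null;
        var ready = new ManualResetEventSlim(false);

        var thread = new Thread(() =>
        {
            dispatcher = Dispatcher.CurrentDispatcher; // creates a dispatcher for this thread
            ready.Set();
            Dispatcher.Run();                          // pump queued work items until shutdown
        });
        thread.IsBackground = true;
        thread.Start();

        ready.Wait();
        return dispatcher;
    }

    static void Main()
    {
        var dispatcher = StartDispatcherThread();
        // Any thread can queue work onto the dispatcher thread.
        dispatcher.BeginInvoke(new Action(() =>
            Console.WriteLine("This runs on the dispatcher thread.")));

        Console.ReadLine();
        dispatcher.InvokeShutdown();
    }
}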
Yes, there is a way to do this. It relies on using the SynchronizationContext class (docs). The sync context abstracts the operations of sending messages from one thread to another via the methods Send (synchronous for the calling thread) and Post (asynchronous for the calling thread).
Let's take a slightly simpler situation where you only want to capture one sync context, the context of the "creator" thread. You would do something like this:
using System;
using System.Threading;

class HardwareEvents
{
    private SynchronizationContext context;
    private Timer timer;

    public HardwareEvents()
    {
        // Capture the creator's sync context, or fall back to the default (thread pool) one.
        context = SynchronizationContext.Current ?? new SynchronizationContext();
        timer = new Timer(TimerMethod, null, 0, 1000); // start immediately, 1 sec interval.
    }

    private void TimerMethod(object state)
    {
        bool hardwareStateChanged = GetHardwareState();
        if (hardwareStateChanged)
            context.Post(s => StateChanged?.Invoke(this, EventArgs.Empty), null);
    }

    public event EventHandler StateChanged;

    private bool GetHardwareState()
    {
        // do something to get the state here.
        return true;
    }
}
Now, the creating thread's sync context will be used when events are invoked. If the creating thread was a UI thread it will have a sync context supplied by the framework. If there is no sync context, then the default implementation is used, which invokes on the thread pool. SynchronizationContext is a class that you can subclass if you want to provide a custom way to send a message from the producer to the consumer thread. Just override Post and Send to send said message.
If you wanted every event subscriber to get called back on their own thread, you would have to capture the sync context in the add method. You then hold on to pairs of sync contexts and delegates. Then when raising the event, you would loop through the sync context / delegate pairs and Post each one in turn.
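For illustration, a hedged sketch of that idea: the event below replaces the auto-implemented StateChanged event above with explicit add/remove accessors that capture each subscriber's current SynchronizationContext (the field and method names are my own; requires using System, System.Collections.Generic and System.Threading):
private readonly List<(SynchronizationContext context, EventHandler handler)> subscribers =
    new List<(SynchronizationContext, EventHandler)>();

public event EventHandler StateChanged
{
    add
    {
        // Capture the subscriber's context at subscription time.
        var context = SynchronizationContext.Current ?? new SynchronizationContext();
        lock (subscribers) subscribers.Add((context, value));
    }
    remove
    {
        lock (subscribers) subscribers.RemoveAll(s => s.handler == value);
    }
}

private void RaiseStateChanged()
{
    (SynchronizationContext context, EventHandler handler)[] snapshot;
    lock (subscribers) snapshot = subscribers.ToArray();

    // Each handler is posted to the context it was subscribed from.
    foreach (var s in snapshot)
        s.context.Post(_ => s.handler(this, EventArgs.Empty), null);
}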
There are several other ways you could improve this. For example, you may want to suspend polling the hardware if there are no subscribers to the event. Or you might want to back off your polling frequency if the hardware does not respond.
First, please note that in .NET / the Base Class Library, it is usually the event subscriber's obligation to ensure that its callback code is executing on the correct thread. That makes it easy for the event producer: it may just trigger its event without having to care about any thread affinities of its various subscribers.
Here's a complete example step-by-step of a possible implementation.
Let's start with something simple: The Producer class and its event, Event. My example won't include how and when this event gets triggered:
class Producer
{
    public event EventHandler Event; // raised e.g. with `Event(this, EventArgs.Empty);`
}
Next, we want to be able to subscribe our Consumer instances to this event and be called back on a specific thread (I'll call this kind of thread a "worker thread"):
class Consumer
{
    public void SubscribeToEventOf(Producer producer, WorkerThread targetWorkerThread) {…}
}
How do we implement this?
First, we need the means to "send" code to a specific worker thread. Since there is no way to force a thread to execute a particular method whenever you want it to, you must arrange for a worker thread to explicitly wait for work items. One way to do this is via a work item queue. Here's a possible implementation for WorkerThread:
sealed class WorkerThread
{
    public WorkerThread()
    {
        this.workItems = new Queue<Action>();
        this.workItemAvailable = new AutoResetEvent(initialState: false);
        new Thread(ProcessWorkItems) { IsBackground = true }.Start();
    }

    readonly Queue<Action> workItems;
    readonly AutoResetEvent workItemAvailable;

    public void QueueWorkItem(Action workItem)
    {
        lock (workItems) // this is not extensively tested btw.
        {
            workItems.Enqueue(workItem);
        }
        workItemAvailable.Set();
    }

    void ProcessWorkItems()
    {
        for (;;)
        {
            workItemAvailable.WaitOne();

            Action workItem;
            lock (workItems) // ditto, not extensively tested.
            {
                workItem = workItems.Dequeue();
                if (workItems.Count > 0) workItemAvailable.Set();
            }

            workItem.Invoke();
        }
    }
}
This class basically starts a thread and puts it in an infinite loop that falls asleep (WaitOne) until an item arrives in its queue (workItems). Once that happens, the item (an Action) is dequeued and invoked. Then the thread goes to sleep again (WaitOne) until another item is available in the queue.
Actions are put in the queue via the QueueWorkItem method. So essentially we can now send code to be executed to a specific WorkerThread instance by calling that method. We're now ready to implement Consumer.SubscribeToEventOf:
class Consumer
{
    public void SubscribeToEventOf(Producer producer, WorkerThread targetWorkerThread)
    {
        producer.Event += delegate(object sender, EventArgs e)
        {
            targetWorkerThread.QueueWorkItem(() => OnEvent(sender, e));
        };
    }

    protected virtual void OnEvent(object sender, EventArgs e)
    {
        // this code is executed on the worker thread(s) passed to `Subscribe…`.
    }
}
Voilà!
P.S. (not discussed in detail): As an add-on, you could package the method of sending code to WorkerThread using a standard .NET mechanism called a SynchronizationContext:
sealed class WorkerThreadSynchronizationContext : SynchronizationContext
{
    public WorkerThreadSynchronizationContext(WorkerThread workerThread)
    {
        this.workerThread = workerThread;
    }

    private readonly WorkerThread workerThread;

    public override void Post(SendOrPostCallback d, object state)
    {
        workerThread.QueueWorkItem(() => d(state));
    }

    // other overrides for `Send` etc. omitted
}
And at the beginning of WorkerThread.ProcessWorkItems, you'd set the synchronization context for that particular thread as follows:
SynchronizationContext.SetSynchronizationContext(
    new WorkerThreadSynchronizationContext(this));
I posted earlier that I've been there, and that there is no nice solution.
However, I just stumbled upon something I have done in another context before: you could instantiate a timer (that is, a System.Windows.Forms.Timer) when you create your wrapper object. This timer will post all Tick events to the UI thread.
Now if your device polling logic is non-blocking and fast, you could implement it directly inside the timer's Tick event and raise your custom event there.
Otherwise, you could continue to do the polling logic inside a thread, and instead of firing the event inside the thread, you just flip some boolean variable which gets read by the timer every 10 ms, which then fires the event.
Note that this solution still requires that the object is created from the GUI thread, but at least the user of the object will not have to worry about Invoke.
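A hedged sketch of that timer-plus-flag idea; DeviceWrapper, PollDevice and the intervals are illustrative assumptions, and the wrapper must be created on the UI thread:
using System;
using System.Threading;
using System.Windows.Forms;

public class DeviceWrapper
{
    private readonly System.Windows.Forms.Timer uiTimer = new System.Windows.Forms.Timer();
    private volatile bool stateChangedPending;   // flipped by the polling thread

    public event EventHandler StateChanged;

    public DeviceWrapper()
    {
        uiTimer.Interval = 10;
        uiTimer.Tick += (s, e) =>
        {
            if (stateChangedPending)
            {
                stateChangedPending = false;
                StateChanged?.Invoke(this, EventArgs.Empty);  // raised on the UI thread
            }
        };
        uiTimer.Start();

        new Thread(() =>
        {
            while (true)
            {
                if (PollDevice())            // slow or blocking hardware check
                    stateChangedPending = true;
                Thread.Sleep(1000);
            }
        }) { IsBackground = true }.Start();
    }

    private bool PollDevice()
    {
        // query the driver here; return true when the state has changed
        return false;
    }
}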
It is possible. One typical approach is to use the BlockingCollection class. This data structure works like a normal queue except that the dequeue operation blocks the calling thread if the queue is empty. The producer will queue items by calling Add and the consumer will dequeue them by calling Take. The consumer typically runs its own dedicated thread spinning an infinite loop waiting for items to appear in the queue. This is, more or less, how the message loop on the UI thread operates and is the basis for getting the Invoke and BeginInvoke operations to accomplish the marshaling behavior.
public class Consumer
{
    private BlockingCollection<Action> queue = new BlockingCollection<Action>();

    public Consumer()
    {
        var thread = new Thread(
            () =>
            {
                while (true)
                {
                    Action method = queue.Take(); // blocks until an item is available
                    method();
                }
            });
        thread.Start();
    }

    public void BeginInvoke(Action method)
    {
        queue.Add(method);
    }
}
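For example (assuming the class above), any thread can then marshal a call onto the consumer's dedicated thread:
var consumer = new Consumer();
// Safe to call from any thread; the lambda runs on the consumer's own thread.
consumer.BeginInvoke(() => Console.WriteLine("Running on the consumer thread"));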
In the following code, TimerRecalcStatisticsElapsed should only ever have one instance of it running. The worker methods that this callback invokes are made to run in sequence, with a maximum of one thread running at a time.
Question Part 1:
If the timer's callback runs on a thread-pool thread (as opposed to running the callback on a separate thread), is it correct to say that the thread pool might queue and defer the callback for later execution based on conditions (MaxThreads reached, thread pool internal logic)?
Question Part 2:
Assuming it's possible for one timer callback to be queued for anything but immediate execution, does that mean that any number of timer callbacks may execute concurrently?
Question Part 3
Assuming part 2 is true, does that mean the code below can ever have more than one callback operating at the same time?
The reason I'm asking is because there are several thousand instances of this class running on a multi-CPU server. I'm also seeing data corruption consistent with an out-of-order operation of // Do Work Here.
Aside
// Do work here internally works with a System.Collections.Dictionary and edits the values of y. It also removes some keys for a subsequent function that is called serially. That function is missing keys (x) that were previously present in the first call. I think this is because there is a race condition with the final statement obj.cleanupdata()
public class SystemTimerTest
{
    readonly System.Timers.Timer timerRecalcStatistics;
    readonly System.Diagnostics.Stopwatch stopwatchForRecalcStatistics = new System.Diagnostics.Stopwatch();

    public SystemTimerTest(TimeSpan range, DataOverwriteAction action)
    {
        int recalculateStatisticsEveryXMillseconds = 1000;
        timerRecalcStatistics = new System.Timers.Timer(recalculateStatisticsEveryXMillseconds);
        timerRecalcStatistics.AutoReset = true;
        timerRecalcStatistics.Elapsed += new System.Timers.ElapsedEventHandler(TimerRecalcStatisticsElapsed);
        timerRecalcStatistics.Interval = recalculateStatisticsEveryXMillseconds;
        timerRecalcStatistics.Enabled = true;

        this.maxRange = range;
        this.hashRunningTotalDB = new HashRunningTotalDB(action);
        this.hashesByDate = new HashesByDate(action);
        this.dataOverwriteAction = action;
    }

    private void TimerRecalcStatisticsElapsed(object source, System.Timers.ElapsedEventArgs e)
    {
        stopwatchForRecalcStatistics.Start();
        Console.WriteLine("The TimerRecalcStatisticsElapsed event was raised at {0}", e.SignalTime.ToString("o"));

        // DO WORK HERE

        stopwatchForRecalcStatistics.Stop();
        double timeBuffer = GetInterval(IntervalTypeEnum.NearestSecond, e.SignalTime) - stopwatchForRecalcStatistics.ElapsedMilliseconds;

        if (timeBuffer > 0)
            timerRecalcStatistics.Interval = timeBuffer;
        else
            timerRecalcStatistics.Interval = 1;

        stopwatchForRecalcStatistics.Reset();
        timerRecalcStatistics.Enabled = true;
    }
}
ad 1) It is not important whether the ThreadPool can defer execution of the callback method, because the callback is not guaranteed to complete execution before another timer interval (or intervals) elapses anyway (the thread can be suspended by the thread scheduler, for example, or the callback might call a long-running function).
ad 2) This is what MSDN says about Timer class:
If the SynchronizingObject property is null, the Elapsed event is raised on a ThreadPool thread. If processing of the Elapsed event lasts longer than Interval, the event might be raised again on another ThreadPool thread. In this situation, the event handler should be reentrant.
So the answer is YES, the callback can be executing on multiple threads concurrently.
ad 3) YES. And you should either avoid using shared resources (timerRecalcStatistics, stopwatchForRecalcStatistics) in the callback method, or synchronize access to these shared resources (for example with lock), or set an appropriate object to the Timer's SynchronizingObject property, or set the AutoReset property of the Timer to false (and enable the timer again at the end of the timer callback).
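For example, a minimal sketch of that last option (AutoReset = false, rearm at the end of the handler), reusing the timerRecalcStatistics field from the question:
// In the constructor:
//     timerRecalcStatistics.AutoReset = false;   // fire once; rearm manually

private void TimerRecalcStatisticsElapsed(object source, System.Timers.ElapsedEventArgs e)
{
    try
    {
        // DO WORK HERE; no second Elapsed callback can start while this one is running
    }
    finally
    {
        timerRecalcStatistics.Start();   // rearm the timer only after the work has finished
    }
}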
UPDATE:
I think that Jon Skeet's answer doesn't solve your problem. Also, implementing your own SynchronizingObject is IMHO more complicated than necessary (but it's hard to say without knowing the whole problem). I hope this implementation should work (but I haven't tested it):
// Note: AsyncResult lives in System.Runtime.Remoting.Messaging.
public class MySynchronizeInvoke : ISynchronizeInvoke
{
    private object SyncObject = new Object();

    private delegate object InvokeDelegate(Delegate method, object[] args);

    public IAsyncResult BeginInvoke(Delegate method, object[] args)
    {
        ElapsedEventHandler handler = (ElapsedEventHandler)method;
        InvokeDelegate D = Invoke;
        return D.BeginInvoke(handler, args, CallbackMethod, null);
    }

    private void CallbackMethod(IAsyncResult ar)
    {
        AsyncResult result = ar as AsyncResult;
        if (result != null)
            ((InvokeDelegate)result.AsyncDelegate).EndInvoke(ar);
    }

    public object EndInvoke(IAsyncResult result)
    {
        result.AsyncWaitHandle.WaitOne();
        return null;
    }

    public object Invoke(Delegate method, object[] args)
    {
        lock (SyncObject)
        {
            ElapsedEventHandler handler = (ElapsedEventHandler)method;
            handler(args[0], (ElapsedEventArgs)args[1]);
            return null;
        }
    }

    public bool InvokeRequired
    {
        get { return true; }
    }
}
From the documentation on System.Timers.Timer:
If the SynchronizingObject property is null, the Elapsed event is raised on a ThreadPool thread. If processing of the Elapsed event lasts longer than Interval, the event might be raised again on another ThreadPool thread. In this situation, the event handler should be reentrant.
So to answer your questions:
Yes, it runs on a thread-pool thread, and is subject to the thread pool filling up and deferring work like anything else. Given that the thread pool now has a maximum of hundreds of threads, this shouldn't be an issue. If it is, you have bigger problems.
Assuming that you do not set a synchronizing object or otherwise synchronize your callback, yes, multiple callbacks can overlap. If you give the timer a synchronizing object, it will not 'overlap' events.
The code that you provided does not synchronize its callback in any way, and so yes, it can have multiple overlapping, simultaneously executing copies of your callback. You should synchronize the method using something like a lock statement if you want all of the instances of the class to be synchronized with one another, or use the SynchronizingObject of the timer if you want each individual instance of the class to only ever have one callback running at any given time.
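As an illustration of the lock-based option, a hedged sketch that guards a single instance's callback; _callbackLock is a field introduced here (make it static if you want to synchronize across all instances), and Monitor.TryEnter makes an overdue tick get skipped instead of piling up behind the lock:
private readonly object _callbackLock = new object();

private void TimerRecalcStatisticsElapsed(object source, System.Timers.ElapsedEventArgs e)
{
    if (!Monitor.TryEnter(_callbackLock))
        return;                          // a previous callback is still running; skip this tick
    try
    {
        // DO WORK HERE
    }
    finally
    {
        Monitor.Exit(_callbackLock);
    }
}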
I'm trying to make cross-threaded calls in C#.
Whenever I invoke the methods of an object created in the context of thread A from a static method called from thread B, the method always runs in thread B. I don't want that; I want it to run on the same thread as the thread A object whose methods I am calling.
Invoke works fine for UI calls and I've read dozens of articles and SO answers relating to different ways of making cross-threaded Forms/WPF calls. However whatever I try (event handling, delegates, etc) Thread A's object's method will always run in Thread B if it is invoked by Thread B.
What part of the library should I be looking in to solve this? If it's relevant, Thread B currently 'spins', reads from a network port and occasionally invokes Thread A's object's method through a delegate that was created in Thread A and passed in using a ParameterizedThreadStart.
I'm not looking to change paradigm, just send a message (a request to invoke a method) from one thread (Thread B) to another (Thread A).
EDIT:
My question was 'what part of the library should I be looking in to solve this?' The answer appears to be none. If I want to clearly delineate consumption and polling I'll have to write my own code to handle that.
Whenever I invoke the methods of an object running on thread A
Objects don't run on threads.
In order for this to work, you will have to create some kind of queue that you can shove a delegate into and that will be routinely checked by thread A's main loop. Something like this, assuming that Something.MainThreadLoop is the entry point for thread A:
public class Something
{
    private Queue<Action> actionQueue = new Queue<Action>();
    private volatile bool threadRunning = true;

    public void RunOnThread(Action action)
    {
        if (action == null)
            throw new ArgumentNullException("action");

        lock (actionQueue)
            actionQueue.Enqueue(action);
    }

    public void Stop()
    {
        threadRunning = false;
    }

    private void RunPendingActions()
    {
        while (actionQueue.Count > 0) {
            Action action;
            lock (actionQueue)
                action = actionQueue.Dequeue();
            action();
        }
    }

    public void MainThreadLoop()
    {
        while (threadRunning) {
            // Do the stuff you were already doing on this thread.
            // Then, periodically...
            RunPendingActions();
        }
    }
}
Then, given a reference to a Something object, you could do this:
something.RunOnThread(() => Console.WriteLine("I was printed from thread A!"));
Code runs on threads. Objects aren't (generally - see thread local) bound to a particular thread. By doing WinFormControl.Invoke or WPFControl.Invoke, you are posting a message to the Message Pump or Dispatcher respectively, to run some code at a later date.
The message pump is something like this:
Message message;
while(GetMessage(&message))
{
ProcessMessage(message);
}
Microsoft has specifically built their UI controls and projects to allow the posting of messages across threads. Calling a method from thread A will always execute that method on thread A, even if it ends up doing some kind of asynchronous work and returning early.
Edit:
What it is I think you need is the Producer Consumer pattern.
http://msdn.microsoft.com/en-us/library/yy12yx1f(VS.80).aspx
Forget about consuming the messages from your main thread, which is what it sounds like you want to do. Consume from thread C.
Thread A is doing 'much more important things'. Thread B is spinning, listening for messages. Thread C is consuming those messages.
No need for marshalling across threads.
EDIT: I think you probably want to use the System.Threading.AutoResetEvent class. The MSDN documentation has a decent example of one thread waiting on the other that I think is similar to what you are trying to do: http://msdn.microsoft.com/en-us/library/system.threading.autoresetevent.aspx
In particular, pay attention to the calls to trigger.WaitOne() and trigger.Set()
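For reference, a minimal sketch of that handshake; the class name and the 500 ms sleep are just for illustration:
using System;
using System.Threading;

class Handshake
{
    // The shared event both threads use to coordinate.
    static readonly AutoResetEvent trigger = new AutoResetEvent(false);

    static void Main()
    {
        new Thread(() =>
        {
            Console.WriteLine("Thread A: waiting for a signal...");
            trigger.WaitOne();                       // blocks until another thread calls Set
            Console.WriteLine("Thread A: got the signal, doing the work now.");
        }).Start();

        Thread.Sleep(500);                           // Thread B does its own work first
        trigger.Set();                               // wakes Thread A exactly once
    }
}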
EDIT2: Added option #3 after reading new comment from OP.
"Whenever I invoke the methods of an object running on thread A ..." - An object doesn't "run" on a thread and isn't really owned by any thread, regardless of what thread created the object.
Given that your question is regarding "non-UI cross-thread invocation", I assume you are already familiar with "UI cross-thread invocation". I can see how WinForms would give you an impression that a thread owns an object and that you need to "send a message" to a thread in order to make it do something.
WinForm control objects are kind of a special case in that they simply don't function properly if you interact with them from a thread that isn't the one that created them, but that's not caused by the way that threads and objects interact.
Anyway, on to addressing your question.
First, a question to clarify the problem: You've mentioned what Thread B is doing, but what is Thread A doing prior to being "invoked" by Thread B?
Here are a couple of ideas that I think are along the lines of what you want to do:
1. Don't create Thread A until you need to. Instead of having Thread B "send a message to Thread A", rather have Thread B create Thread A (or call it Thread C if you prefer) and make it start executing at that time.
2. If you need Thread A to already exist and you only want Thread A to handle Thread B's events one at a time, you could have Thread A wait until it receives a notification from Thread B. Take a look at the System.Threading.WaitHandle class (derived classes of interest are ManualResetEvent and AutoResetEvent). Thread A will at some point call WaitHandle.WaitOne(), which will cause it to pause and wait until Thread B calls WaitHandle.Set() on the same WaitHandle object.
3. If Thread A is busy doing other things, then you might want to set up some kind of flag variable. Similar to the WaitHandle concept in #2, but instead of causing Thread A to pause, you just want Thread B to set a flag (perhaps just a boolean variable) that will signal to Thread A that it needs to do something. While Thread A is busy doing other things, it can periodically check that flag to decide whether or not there is work that needs to be done.
Does the method that Thread A will execute on your object require any input from Thread B? Then before Thread B calls WaitHandle.Set(), have it stick some data into a queue or something. Then, when Thread A is "activated", it can retrieve that data from the queue and proceed to execute the object's method using that data. Use a lock mechanism (i.e. the C# lock statement) to synchronize access to the queue.
What you're going to have to do is roll a sort of Queue and have Thread A watch that queue for work. When Thread A sees new work enter the queue, it can dequeue it and do the work, then return to waiting for more.
Here's some pseudo-code:
public class ThreadAQueue
{
    private Queue<Action> _queue = new Queue<Action>();
    private bool _quitWorking;

    public void EnqueueSomeWork(Action work)
    {
        lock (_queue)
        {
            _queue.Enqueue(work);
        }
    }

    private void DoTheWork()
    {
        while (!_quitWorking)
        {
            Action myWork = null;
            lock (_queue)
            {
                if (_queue.Count > 0)
                    myWork = _queue.Dequeue();
            }
            if (myWork != null)
                myWork();
        }
    }
}