WPF Event Aggregator - wait for previous execution to complete - c#

I am working on a WPF application using Prism. I am using EventAggregator to communicate between viewmodels.
public class PublisherViewModel
{
_eventAggregator.GetEvent<RefreshEvent>().Publish("STOCKS");
}
public class SubscriberViewModel
{
public SubscriberViewModel(IEventAggregator ea)
{
ea.GetEvent<RefreshEvent>().Subscribe(RefreshData);
}
void RefreshData(string category)
{
Task.Run(() =>
{
//Long Running Operation
Dispatcher.Invoke(() =>
{
//Refresh UI
});
});
}
}
PublisherViewModel can publish events one after another. However, since the SubscriberViewModel runs a long-running Task that is not awaited (and cannot be), the second request from the publisher starts executing right away. At the SubscriberViewModel I want to handle all incoming requests so that they are executed one after another, in the order in which they arrive.
I am thinking of handling this using a queue-based mechanism.
Could you please suggest the best practice for this?
Thanks!!
Update:-
I have used the below approach:
public class BlockingQueue<T> where T : class
{
private readonly BlockingCollection<JobQueueItem<T>> _jobs;
public BlockingQueue(int upperBound)
{
_jobs = new BlockingCollection<JobQueueItem<T>>(upperBound);
var thread = new Thread(new ThreadStart(OnStart));
thread.IsBackground = true;
thread.Start();
}
public void Enqueue(T parameter, Func<T, Task> action)
{
_jobs.Add(new JobQueueItem<T> { Parameter = parameter, JobAction = action });
}
private void OnStart()
{
foreach (var job in _jobs.GetConsumingEnumerable(CancellationToken.None))
{
if (job != null && job.JobAction != null)
{
job.JobAction.Invoke(job.Parameter).Wait();
}
}
}
private class JobQueueItem<T>
{
internal T Parameter { get; set; }
internal Func<T, Task> JobAction { get; set; }
}
}
public class SubscriberViewModel
{
BlockingQueue<RefreshEventArgs> RefreshQueue = new ...;
//inside Subscribed method
RefreshQueue.Enqueue(args, RefreshData);
}
Please suggest. Thanks!

I am thinking of handling this using a queue-based mechanism.
This is the way to go. Set up a queue (probably an asynchronous queue), push the events in the subscriber and consume them from a worker task.
TPL Dataflow is one option to do this: create an ActionBlock<string> from the handler and post the events to it as they come in.
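For example, a minimal sketch of that idea (the field and handler names below are assumptions, not part of the original code):
using System.Threading.Tasks;
using System.Threading.Tasks.Dataflow; // requires the System.Threading.Tasks.Dataflow package
public class SubscriberViewModel
{
    private readonly ActionBlock<string> _refreshBlock;
    public SubscriberViewModel(IEventAggregator ea)
    {
        // MaxDegreeOfParallelism defaults to 1, so posted events are
        // processed strictly one at a time, in the order they arrive.
        _refreshBlock = new ActionBlock<string>(async category =>
        {
            await Task.Run(() =>
            {
                // Long-running operation
            });
            // Refresh the UI afterwards (marshal to the UI thread as before).
        });
        ea.GetEvent<RefreshEvent>().Subscribe(OnRefreshRequested);
    }
    private void OnRefreshRequested(string category) => _refreshBlock.Post(category);
}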

Related

Producer/ Consumer pattern using threads and EventWaitHandle

I guess it is sort of a code review, but here is my implementation of the producer/consumer pattern. What I would like to know is whether there is a case in which the while loops in the ReceivingThread() or SendingThread() methods might stop executing. Please note that EnqueueSend(DataSendEnqeueInfo info) is called from multiple different threads, and I probably can't use tasks here since I definitely have to consume commands in a separate thread.
private Thread mReceivingThread;
private Thread mSendingThread;
private Queue<DataRecievedEnqeueInfo> mReceivingThreadQueue;
private Queue<DataSendEnqeueInfo> mSendingThreadQueue;
private readonly object mReceivingQueueLock = new object();
private readonly object mSendingQueueLock = new object();
private bool mIsRunning;
EventWaitHandle mRcWaitHandle;
EventWaitHandle mSeWaitHandle;
private void ReceivingThread()
{
while (mIsRunning)
{
mRcWaitHandle.WaitOne();
DataRecievedEnqeueInfo item = null;
while (mReceivingThreadQueue.Count > 0)
{
lock (mReceivingQueueLock)
{
item = mReceivingThreadQueue.Dequeue();
}
ProcessReceivingItem(item);
}
mRcWaitHandle.Reset();
}
}
private void SendingThread()
{
while (mIsRunning)
{
mSeWaitHandle.WaitOne();
while (mSendingThreadQueue.Count > 0)
{
DataSendEnqeueInfo item = null;
lock (mSendingQueueLock)
{
item = mSendingThreadQueue.Dequeue();
}
ProcessSendingItem(item);
}
mSeWaitHandle.Reset();
}
}
internal void EnqueueRecevingData(DataRecievedEnqeueInfo info)
{
lock (mReceivingQueueLock)
{
mReceivingThreadQueue.Enqueue(info);
mRcWaitHandle.Set();
}
}
public void EnqueueSend(DataSendEnqeueInfo info)
{
lock (mSendingQueueLock)
{
mSendingThreadQueue.Enqueue(info);
mSeWaitHandle.Set();
}
}
P.S. The idea here is that I am using WaitHandles to put the threads to sleep when the queues are empty, and to signal them to start when new items are enqueued.
UPDATE
I am just going to leave this here for people who might be trying to implement the producer/consumer pattern using TPL or tasks: https://blogs.msdn.microsoft.com/benwilli/2015/09/10/tasks-are-still-not-threads-and-async-is-not-parallel/
Use a BlockingCollection instead of Queue, EventWaitHandle and lock objects:
public class DataInfo { }
private Thread mReceivingThread;
private Thread mSendingThread;
private BlockingCollection<DataInfo> queue = new BlockingCollection<DataInfo>();
private CancellationTokenSource receivingCts = new CancellationTokenSource();
private void ReceivingThread()
{
try
{
while (!receivingCts.IsCancellationRequested)
{
// This will block until an item is added to the queue or the cancellation token is cancelled
DataInfo item = queue.Take(receivingCts.Token);
ProcessReceivingItem(item);
}
}
catch (OperationCanceledException)
{
}
}
internal void EnqueueRecevingData(DataInfo info)
{
// When a new item is produced, just add it to the queue
queue.Add(info);
}
// To cancel the receiving thread, cancel the token
private void CancelReceivingThread()
{
receivingCts.Cancel();
}
Personally, for simple producer-consumer problems, I would just use BlockingCollection. There would be no need to manually code your own synchronization logic. The consuming threads will also block if there are no items present in the queue.
Here is what your code might look like if you use this class:
private BlockingCollection<DataRecievedEnqeueInfo> mReceivingThreadQueue = new BlockingCollection<DataRecievedEnqeueInfo>();
private BlockingCollection<DataSendEnqeueInfo> mSendingThreadQueue = new BlockingCollection<DataSendEnqeueInfo>();
public void Stop()
{
// No need for mIsRunning. This causes the enumerables returned by the
// GetConsumingEnumerable() calls below to complete.
mReceivingThreadQueue.CompleteAdding();
mSendingThreadQueue.CompleteAdding();
}
private void ReceivingThread()
{
foreach (DataRecievedEnqeueInfo item in mReceivingThreadQueue.GetConsumingEnumerable())
{
ProcessReceivingItem(item);
}
}
private void SendingThread()
{
foreach (DataSendEnqeueInfo item in mSendingThreadQueue.GetConsumingEnumerable())
{
ProcessSendingItem(item);
}
}
internal void EnqueueRecevingData(DataRecievedEnqeueInfo info)
{
// You can also use TryAdd() if there is a possibility that you
// can add items after you have stopped. Otherwise, this can throw an
// an exception after CompleteAdding() has been called.
mReceivingThreadQueue.Add(info);
}
public void EnqueueSend(DataSendEnqeueInfo info)
{
mSendingThreadQueue.Add(info);
}
As suggested in the comments, you can also give the TPL Dataflow blocks a try.
As far as I can see, you have two similar pipelines, for receive and send, so I assume that your class hierarchy is like this:
class EnqueueInfo { }
class DataRecievedEnqeueInfo : EnqueueInfo { }
class DataSendEnqeueInfo : EnqueueInfo { }
We can assemble an abstract class which encapsulates the logic for creating the pipeline and provides the interface for processing the items, like this:
abstract class EnqueueInfoProcessor<T>
where T : EnqueueInfo
{
// here we will store all the messages received before the handling
private readonly BufferBlock<T> _buffer;
// simple action block for actual handling the items
private ActionBlock<T> _action;
// cancellation token to cancel the pipeline
public EnqueueInfoProcessor(CancellationToken token)
{
_buffer = new BufferBlock<T>(new DataflowBlockOptions { CancellationToken = token });
_action = new ActionBlock<T>(item => ProcessItem(item), new ExecutionDataflowBlockOptions
{
MaxDegreeOfParallelism = Environment.ProcessorCount,
CancellationToken = token
});
// we are linking two blocks so all the items from buffer
// will flow down to action block in order they've been received
_buffer.LinkTo(_action, new DataflowLinkOptions { PropagateCompletion = true });
}
public void PostItem(T item)
{
// post the item synchronously (Post returns immediately)
_buffer.Post(item);
}
public async Task SendItemAsync(T item)
{
// asynchronously wait for message to be posted
await _buffer.SendAsync(item);
}
// abstract method to implement
protected abstract void ProcessItem(T item);
}
Note that you can also encapsulate the link between the two blocks by using the Encapsulate<TInput, TOutput> method, but in that case you have to handle the Completion of the buffer block properly, if you're using it.
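For illustration, a rough sketch of the Encapsulate idea (CreateSendInfo below is an assumed mapping helper, not part of the original code):
using System.Threading.Tasks.Dataflow;
var buffer = new BufferBlock<DataRecievedEnqeueInfo>();
var transform = new TransformBlock<DataRecievedEnqeueInfo, DataSendEnqeueInfo>(
    item => CreateSendInfo(item)); // assumed mapping helper
// Completion must be propagated inside the encapsulated pair, otherwise
// completing the outer block never completes the inner transform block.
buffer.LinkTo(transform, new DataflowLinkOptions { PropagateCompletion = true });
IPropagatorBlock<DataRecievedEnqeueInfo, DataSendEnqeueInfo> pipeline =
    DataflowBlock.Encapsulate(buffer, transform);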
After this, we just need to implement two methods for receive and send handle logic:
public class SendEnqueueInfoProcessor : EnqueueInfoProcessor<DataSendEnqeueInfo>
{
public SendEnqueueInfoProcessor(CancellationToken token)
: base(token)
{
}
protected override void ProcessItem(DataSendEnqeueInfo item)
{
// send logic here
}
}
public class RecievedEnqueueInfoProcessor : EnqueueInfoProcessor<DataRecievedEnqeueInfo>
{
public RecievedEnqueueInfoProcessor(CancellationToken token)
: base(token)
{
}
protected override void ProcessItem(DataRecievedEnqeueInfo item)
{
// recieve logic here
}
}
You can also create a more complicated pipeline with a TransformBlock<DataRecievedEnqeueInfo, DataSendEnqeueInfo>, if your message flow is one where a ReceiveInfo message becomes a SendInfo message.
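A rough sketch of that shape (CreateSendInfo and SendItem below are assumed helpers, not part of the original code):
using System.Threading.Tasks.Dataflow;
var receiveToSend = new TransformBlock<DataRecievedEnqeueInfo, DataSendEnqeueInfo>(
    received => CreateSendInfo(received));                              // assumed mapping helper
var send = new ActionBlock<DataSendEnqeueInfo>(info => SendItem(info)); // assumed send method
// Received items flow through the transform and are then sent, in arrival order.
receiveToSend.LinkTo(send, new DataflowLinkOptions { PropagateCompletion = true });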

Simple in-memory message queue

Our existing implementation of domain events limits (by blocking) publishing to one thread at a time to avoid reentrant calls to handlers:
public interface IDomainEvent {} // Marker interface
public class Dispatcher : IDisposable
{
private readonly SemaphoreSlim semaphore = new SemaphoreSlim(1, 1);
// Subscribe code...
public void Publish(IDomainEvent domainEvent)
{
semaphore.Wait();
try
{
// Get event subscriber(s) from concurrent dictionary...
foreach (Action<IDomainEvent> subscriber in eventSubscribers)
{
subscriber(domainEvent);
}
}
finally
{
semaphore.Release();
}
}
// Dispose pattern...
}
If a handler publishes an event, this will deadlock.
How can I rewrite this to serialize calls to Publish? In other words, if subscribing handler A publishes event B, I'll get:
Handler A called
Handler B called
while preserving the condition of no reentrant calls to handlers in a multithreaded environment.
I do not want to change the public method signature; there is no place in the application where a separate method to flush a queue could be called, for instance.
We came up with a way to do it synchronously.
public class Dispatcher : IDisposable
{
private readonly ConcurrentQueue<IDomainEvent> queue = new ConcurrentQueue<IDomainEvent>();
private readonly SemaphoreSlim semaphore = new SemaphoreSlim(1, 1);
// Subscribe code...
public void Publish(IDomainEvent domainEvent)
{
queue.Enqueue(domainEvent);
if (IsPublishing)
{
return;
}
PublishQueue();
}
private void PublishQueue()
{
IDomainEvent domainEvent;
while (queue.TryDequeue(out domainEvent))
{
InternalPublish(domainEvent);
}
}
private void InternalPublish(IDomainEvent domainEvent)
{
semaphore.Wait();
try
{
// Get event subscriber(s) from concurrent dictionary...
foreach (Action<IDomainEvent> subscriber in eventSubscribers)
{
subscriber(domainEvent);
}
}
finally
{
semaphore.Release();
}
// Necessary, as calls to Publish during publishing could have queued events and returned.
PublishQueue();
}
private bool IsPublishing
{
get { return semaphore.CurrentCount < 1; }
}
// Dispose pattern for semaphore...
}
You will have to make Publish asynchronous to achieve that. A naive implementation would be as simple as:
public class Dispatcher : IDisposable {
private readonly BlockingCollection<IDomainEvent> _queue = new BlockingCollection<IDomainEvent>(new ConcurrentQueue<IDomainEvent>());
private readonly CancellationTokenSource _cts = new CancellationTokenSource();
public Dispatcher() {
new Thread(Consume) {
IsBackground = true
}.Start();
}
private List<Action<IDomainEvent>> _subscribers = new List<Action<IDomainEvent>>();
public void AddSubscriber(Action<IDomainEvent> sub) {
_subscribers.Add(sub);
}
private void Consume() {
try {
foreach (var @event in _queue.GetConsumingEnumerable(_cts.Token)) {
try {
foreach (Action<IDomainEvent> subscriber in _subscribers) {
subscriber(@event);
}
}
catch (Exception ex) {
// log, handle
}
}
}
catch (OperationCanceledException) {
// expected
}
}
public void Publish(IDomainEvent domainEvent) {
_queue.Add(domainEvent);
}
public void Dispose() {
_cts.Cancel();
}
}
It can't be done with that interface. You can process the event subscriptions asynchronously to remove the deadlock while still running them serially, but then you can't guarantee the order you described. Another call to Publish might enqueue something (event C) while the handler for event A is running but before it publishes event B. Then event B ends up behind event C in the queue.
As long as Handler A is on equal footing with other clients when it comes to getting an item in the queue, it either has to wait like everyone else (deadlock) or it has to play fairly (first come, first served). The interface you have there doesn't allow the two to be treated differently.
That's not to say you couldn't get up to some shenanigans in your logic to attempt to differentiate them (e.g. based on thread id or something else identifiable), but anything along those lines would be unreliable if you don't control the subscriber code as well.

push work to observers

I have a listener, which receives work in the form of IPayload. The listener should push this work to observers who actually do the work. This is my first crude attempt to achieve this:
public interface IObserver
{
void DoWork(IPayload payload);
}
public interface IObservable
{
void RegisterObserver(IObserver observer);
void RemoveObserver(IObserver observer);
void NotifyObservers(IPayload payload);
}
public class Observer : IObserver
{
public void DoWork(IPayload payload)
{
// do some heavy lifting
}
}
public class Listener : IObservable
{
private readonly List<IObserver> _observers = new List<IObserver>();
public void PushIncomingPayLoad(IPayload payload)
{
NotifyObservers(payload);
}
public void RegisterObserver(IObserver observer)
{
_observers.Add(observer);
}
public void RemoveObserver(IObserver observer)
{
_observers.Remove(observer);
}
public void NotifyObservers(IPayload payload)
{
Parallel.ForEach(_observers, observer =>
{
observer.DoWork(payload);
});
}
}
Is this a valid approach that follows the observer/observable pattern (i.e. pub/sub)? My understanding is that NotifyObservers also spins up a thread for each payload. Is this correct? Any improvement suggestions are very much welcome.
Please note that all observers have to finish their work before new work in the form of a payload is passed on to them - the order of 'observation' does not matter. Basically, the listener has to act like a master whilst exploiting the cores of the host as much as possible using the TPL. IMHO this requires the explicit registration of observers with the listener/Observable.
PS:
I think Parallel.ForEach does not create a thread for each observer: Why isn't Parallel.ForEach running multiple threads? If this is true, how can I ensure that a thread is created for each observer?
An alternative I have in mind is this:
public async void NotifyObservers(IPayload payload)
{
foreach (var observer in _observers)
{
var observer1 = observer;
await Task.Run(() => observer1.DoWork(payload));
}
await Task.WhenAll();
}
Of course you can do it this way, but in .NET that is not needed if you don't want to reinvent the wheel :-)
In C#, this could be done using events.
A short example :
//Your Listener who has a public eventhandler where others can add them as listeners
public class Listener{
//your eventhandler where others "add" them as listeners
public event EventHandler<PayloadEventArgs> IncomingPayload;
//a method where you process new data and want to notify the others
public void PushIncomingPayLoad(IPayload payload)
{
//check if there are any listeners
if(IncomingPayload != null)
//if so, then notify them with the data in the PayloadEventArgs
IncomingPayload(this, new PayloadEventArgs(payload));
}
}
//Your EventArgs class to hold the data
public class PayloadEventArgs : EventArgs{
public IPayload payload { get; private set; }
public PayloadEventArgs(IPayload payload){
this.payload = payload;
}
}
public class Worker{
//add this instance as an observer (e.g. in the constructor):
YourListenerInstance.IncomingPayload += DoWork;
//remove this instance
YourListenerInstance.IncomingPayload -= DoWork;
//This method gets called when the Listener notifies the IncomingPayload listeners
void DoWork(Object sender, PayloadEventArgs e){
Console.WriteLine(e.payload);
}
}
EDIT: As the question asks for parallel execution, how about starting the new thread on the subscriber side? I think this is the easiest way to achieve this.
//Inside the DoWork method of the subscriber start a new thread
Task.Factory.StartNew( () =>
{
//Do your work here
});
//If you want to make sure that a new thread is used for the task, then add the TaskCreationOptions.LongRunning parameter
Task.Factory.StartNew( () =>
{
//Do your work here
}, TaskCreationOptions.LongRunning);
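If, as the question requires, the listener itself must wait for every observer to finish before the next payload is pushed, a rough sketch along these lines may fit better than awaiting each Task.Run one by one (NotifyObserversAsync is an assumed name, not from the original code):
// requires using System.Linq; and using System.Threading.Tasks;
public async Task NotifyObserversAsync(IPayload payload)
{
    // Start one task per observer, then wait for all of them together.
    var tasks = _observers.Select(observer => Task.Run(() => observer.DoWork(payload)));
    await Task.WhenAll(tasks);
}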
Hopefully this answers your question? If not, please leave a comment.

Unit testing a background thread with an interface

I have created a class, SenderClass, which will start and run a background worker from its constructor.
The method, RunWorker(), runs a while(true) loop which pops elements from a queue, sends them through the method SendMessage(), and sleeps for a small amount of time to allow new elements to be added to the queue.
Here lies the problem: how do I test the method that sends the elements from the queue, without exposing it to those who use the class?
Implementation:
public class SenderClass : ISenderClass
{
private Queue<int> _myQueue = new Queue<int>();
private Thread _worker;
public SenderClass()
{
//Create a background worker
_worker = new Thread(RunWorker) {IsBackground = true};
_worker.Start();
}
private void RunWorker() //This is the background worker's method
{
while (true) //Keep it running
{
lock (_myQueue) //No fiddling from other threads
{
while (_myQueue.Count != 0) //Pop elements if found
SendMessage(_myQueue.Dequeue()); //Send the element
}
Thread.Sleep(50); //Allow new elements to be inserted
}
}
private void SendMessage(int element)
{
//This is what we want to test
}
public void AddToQueue(int element)
{
Task.Run(() => //Async method will return at once, not slowing the caller
{
lock (_myQueue) //Lock queue to insert into it
{
_myQueue.Enqueue(element);
}
});
}
}
Wanted interface:
public interface ISenderClass
{
void AddToQueue(int element);
}
Needed interface for test purpose:
public interface ISenderClass
{
void SendMessage(int element);
void AddToQueue(int element);
}
There's a very simple solution, namely that I have created my class incorrectly with respect to the Single Responsibility Principle: my class' purpose is not to send messages, but actually to run what sends them.
What I should have, is another class, TransmittingClass, which exposes the method SendMessage(int) through its own interface.
This way I can test that class, and SenderClass should just call the method through that interface.
But what other options do I have with the current implementation?
I can make all the private methods I wish to test (all of them) internal and add [assembly:InternalsVisibleTo("MyTests")], but does a third option exist?
Send message logic should be implemented in a separate class with a separate interface. This class should take the new class as a dependency. You can test the new class separately.
public interface IMessageQueue
{
void AddToQueue(int element);
}
public interface IMessageSender
{
void SendMessage(object message);
}
public class SenderClass : IMessageQueue
{
private readonly IMessageSender _sender;
public SenderClass(IMessageSender sender)
{
_sender = sender;
}
public void AddToQueue(int element)
{
/*...*/
}
private void SendMessage()
{
_sender.SendMessage(new object());
}
}
public class DummyMessageSender : IMessageSender
{
//you can use this in your test harness to check for the messages sent
public Queue<object> Messages { get; private set; }
public DummyMessageSender()
{
Messages = new Queue<object>();
}
public void SendMessage(object message)
{
Messages.Enqueue(message);
//obviously you'll need to do some locking here too
}
}
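For example, a rough sketch of how the dummy sender might be used in a test (assuming AddToQueue eventually causes the background worker to call SendMessage, as in the original class):
[TestMethod]
public void AddToQueue_eventually_sends_the_message()
{
    var dummySender = new DummyMessageSender();
    var senderClass = new SenderClass(dummySender);
    senderClass.AddToQueue(42);
    Thread.Sleep(100); // crude wait for the background worker, like the test further below
    Assert.AreEqual(1, dummySender.Messages.Count);
}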
Edit
To address your comment, here is an implementation using Action<int>. This allows you to define your message sending action in your test class to mock the SendMessage method without worrying about creating another class. (Personally, I'd still prefer to define the classes/interfaces explicitly).
public class SenderClass : ISenderClass
{
private Queue<int> _myQueue = new Queue<int>();
private Thread _worker;
private readonly Action<int> _senderAction;
public SenderClass()
{
_worker = new Thread(RunWorker) { IsBackground = true };
_worker.Start();
_senderAction = DefaultMessageSendingAction;
}
public SenderClass(Action<int> senderAction)
{
//Create a background worker
_worker = new Thread(RunWorker) { IsBackground = true };
_worker.Start();
_senderAction = senderAction;
}
private void RunWorker() //This is the background worker's method
{
while (true) //Keep it running
{
lock (_myQueue) //No fiddling from other threads
{
while (_myQueue.Count != 0) //Pop elements if found
SendMessage(_myQueue.Dequeue()); //Send the element
}
Thread.Sleep(50); //Allow new elements to be inserted
}
}
private void SendMessage(int element)
{
_senderAction(element);
}
private void DefaultMessageSendingAction(int item)
{
/* whatever happens during sending */
}
public void AddToQueue(int element)
{
Task.Run(() => //Async method will return at once, not slowing the caller
{
lock (_myQueue) //Lock queue to insert into it
{
_myQueue.Enqueue(element);
}
});
}
}
public class TestClass
{
private SenderClass _sender;
private Queue<int> _messages;
[TestInitialize]
public void SetUp()
{
_messages = new Queue<int>();
_sender = new SenderClass(DummyMessageSendingAction);
}
private void DummyMessageSendingAction(int item)
{
_messages.Enqueue(item);
}
[TestMethod]
public void TestMethod1()
{
//This isn't a great test, but I think you get the idea
int message = 42;
_sender.AddToQueue(message);
Thread.Sleep(100);
CollectionAssert.Contains(_messages, 42);
}
}
It looks like SenderClass should not perform any sending at all. It should simply maintain the queue. Inject an Action<int> through the constructor that does the sending. That way you can move SendMessage somewhere else and call it however you like.
As an added benefit your test of SendMessage is not cluttered with queue management.
Seeing your edit, you don't seem to like this approach, and you don't seem to like the InternalsVisibleTo approach either. You could expose SendMessage through a separate interface and implement that interface explicitly. That way SendMessage is still callable through that interface, but by default it is not accessible without some casting contortions. It also does not show up in the IntelliSense autocomplete list.
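A minimal sketch of that explicit-implementation idea (the IMessageSending interface name is an assumption, not from the question):
public interface IMessageSending
{
    void SendMessage(int element);
}
public class SenderClass : ISenderClass, IMessageSending
{
    // Explicit implementation: SendMessage does not appear on SenderClass itself
    // and is only reachable by casting to IMessageSending (e.g. in a test).
    void IMessageSending.SendMessage(int element)
    {
        // sending logic
    }
    public void AddToQueue(int element)
    {
        // queueing logic as before
    }
}
// In a test:
// ((IMessageSending)senderInstance).SendMessage(42);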

WinForm asynchronously update UI status from console application call

I want to asynchronously update UI status while doing a long-running task. The program is a console application; however, when I execute the async operations, the UI thread exits soon after the task begins.
How should I make the UI thread wait until my long-running task finishes?
I have simplified my code as below:
public static class Program
{
static void Main()
{
WorkerWrapper wp = new WorkerWrapper();
wp.ProcessData();
}
}
public class WorkerWrapper
{
private RateBar bar;
public void ProcessData()
{
bar = new RateBar();
bar.Show();
Worker wk = new Worker();
wk.WorkProcess += wk_WorkProcess;
Action handler = new Action(wk.DoWork);
var result = handler.BeginInvoke(new AsyncCallback(this.AsyncCallback), handler);
}
private void AsyncCallback(IAsyncResult ar)
{
Action handler = ar.AsyncState as Action;
handler.EndInvoke(ar);
}
private void wk_WorkProcess(object sender, PrecentArgs e)
{
if (e.Precent < 100)
{
bar.Precent = e.Precent;
}
}
}
public class Worker
{
public event EventHandler<PrecentArgs> WorkProcess;
public void DoWork()
{
for (int i = 0; i < 100; i++)
{
WorkProcess(this, new PrecentArgs(i));
Thread.Sleep(100);
}
}
}
public class PrecentArgs : EventArgs
{
public int Precent { get; set; }
public PrecentArgs(int precent)
{
Precent = precent;
}
}
public partial class RateBar : Form
{
public int Precent
{
set
{
System.Windows.Forms.MethodInvoker invoker = () => this.progressBar1.Value = value;
if (this.progressBar1.InvokeRequired)
{
this.progressBar1.Invoke(invoker);
}
else
{
invoker();
}
}
}
public RateBar()
{
InitializeComponent();
}
}
However, in the method ProcessData(), if I add result.AsyncWaitHandle.WaitOne() at the end to wait for my operation to complete, the Form will freeze.
Is there anything wrong with the way I wait for the thread to complete?
The reason your application exits before your "background threads" complete is that a process exits as soon as there are no foreground threads left, regardless of how many background threads are still running. This is explained in more detail here: http://msdn.microsoft.com/en-us/library/system.threading.thread.isbackground(v=vs.110).aspx
You should add a proper mechanism to wait for your background threads to complete. There are multiple ways of letting other threads know that a thread is finished; please refer to: How to wait for thread to finish with .NET?
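For example, a minimal sketch of one such mechanism (the names below are illustrative, not from the question's code):
using System.Threading;
var done = new ManualResetEventSlim();
var worker = new Thread(() =>
{
    // long-running work
    done.Set();
}) { IsBackground = true };
worker.Start();
done.Wait();   // or keep a reference to the thread and call worker.Join()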
You shouldn't block the UI thread waiting for the result, but rather retrieve it from EndInvoke. Your deadlock probably occurs because you are using both result.AsyncWaitHandle.WaitOne() and EndInvoke; both will block until the result is available.
In my opinion the best option is to not call result.AsyncWaitHandle.WaitOne() at all and simply complete the call with EndInvoke in the AsyncCallback:
private void AsyncCallback(IAsyncResult ar)
{
Action handler = ar.AsyncState as Action;
handler.EndInvoke(ar);
}
More information here. Also, if you are using .NET 4.5 or higher, this sort of thing can be done much more easily with async/await.
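For instance, a rough sketch of the async/await shape (assuming ProcessData is called from a context that keeps the form's message loop alive; ProcessDataAsync is an assumed rename):
public async Task ProcessDataAsync()
{
    bar = new RateBar();
    bar.Show();
    Worker wk = new Worker();
    wk.WorkProcess += wk_WorkProcess;
    // Run the long-running work on a pool thread and wait for it
    // without blocking the thread that owns the form.
    await Task.Run(() => wk.DoWork());
}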
I am writing down this solution in the hope that it may help others with the same question.
The key to this problem is to use a new thread to run the RateBar's ShowDialog function.
public void ProcessData()
{
new Thread(() => new RateBar().ShowDialog()).Start();
Worker wk = new Worker();
wk.WorkProcess += wk_WorkProcess;
Action handler = new Action(wk.DoWork);
var result = handler.BeginInvoke(new AsyncCallback(this.AsyncCallback), handler);
}
