How to wait if an event is received before some action completes in C#?

I have a class in which a method is called based on events received from an external application.
public void ProcessItems(Store id, Items items)
{
    // Some logic that produces validItems
    UpdateValidItems(id, validItems);
}

public void UpdateValidItems(Store id, Items items)
{
    // Save in DB
}
The external application may invoke ProcessItems again while UpdateValidItems is still running. If an event arrives while UpdateValidItems is being processed, I want the new call to wait until the current UpdateValidItems has completed. Is there any way to do this?
Also, multiple stores can be processed at a time, so the waiting should be per store only: if the store id is different, the call should not wait.

I'd decouple the incoming events from the processing:
0. Have a thread wait on a blocking queue.
1. The event handler writes to the blocking queue.
2. The thread from 0. gets notified and dequeues one "row" (id and items).
3. That thread processes the items.
4. The thread waits again, or, if the event added more rows in the meantime, it keeps processing until the queue is empty and then waits again.
This ensures that:
- Only one store is mutated at a time.
- The event handler returns fast.
- Subsequent events for the same store do not interfere with the processing currently in progress.
You may also have a look into TPL Dataflow to implement a similar approach.
Edit / Basic Example:
public class Handler
{
private readonly BlockingCollection<QueueEntry> _queue = new BlockingCollection<QueueEntry>();
private readonly CancellationTokenSource _cts = new CancellationTokenSource();
// I used a Form with a button to simulate events, so you'll have to adapt that..
public Handler(Form1 parent)
{
// register for incoming Items
parent.NewItems += Parent_NewItems;
// Start processing on a long running Pool-Thread
Task.Factory.StartNew(QueueWorker, TaskCreationOptions.LongRunning);
}
// Stop Processing
public void Shutdown( bool doitnow )
{
// Mark the queue "complete" - adding is now forbidden.
_queue.CompleteAdding();
// If you want to stop NOW, cancel all operations
if (doitnow ) { _cts.Cancel(); }
// Else the Task will run until the queue has been processed.
}
// This is all that happens on the EDT / Main / UI Thread
private void Parent_NewItems(object sender, NewItemsEventArgs e)
{
try
{
_queue.Add(new QueueEntry { Sender = sender, Event = e });
}
catch (InvalidOperationException)
{
// dontcare ? I didn't - You may, though.
// Will be thrown if the queue has been marked complete.
}
}
private async Task QueueWorker()
{
// Run until the queue has been marked complete AND has been fully drained (that is what IsCompleted means)
while (!_queue.IsCompleted)
{
QueueEntry entry = null;
try
{
// Wait until an entry is available or until canceled.
entry = _queue.Take(_cts.Token);
}
catch ( OperationCanceledException )
{
// dontcare
}
if (entry != null)
{
await Process(entry, _cts.Token);
}
}
}
private async Task Process(QueueEntry entry, CancellationToken cancel)
{
// Dummy Processing...
await Task.Delay(TimeSpan.FromSeconds(entry.Event.Items), cancel);
}
}
public class QueueEntry
{
public object Sender { get; set; }
public NewItemsEventArgs Event { get; set; }
}
Of course, this can be tuned to allow for some concurrency / parallel processing.
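For the per-store requirement from the question (serialize work within a store, but let different stores run in parallel), one option is a single-threaded worker per store key. The following is a rough sketch of my own, not part of the answer above; Store.Key, the string key type and UpdateValidItems are assumptions about the asker's types, and it requires the System.Threading.Tasks.Dataflow package plus System.Collections.Concurrent:
private readonly ConcurrentDictionary<string, ActionBlock<(Store Store, Items Items)>> _storeQueues =
    new ConcurrentDictionary<string, ActionBlock<(Store Store, Items Items)>>();

public void ProcessItems(Store id, Items items)
{
    // One ActionBlock per store; MaxDegreeOfParallelism = 1 keeps items of the same store in order.
    var queue = _storeQueues.GetOrAdd(id.Key, _ =>
        new ActionBlock<(Store Store, Items Items)>(
            entry => UpdateValidItems(entry.Store, entry.Items),
            new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 1 }));

    queue.Post((id, items)); // returns immediately; different stores are processed concurrently
}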

private object lock_object = new object();

public void ProcessItems(Store id, Items items)
{
    // Some logic
    lock (lock_object)
    {
        UpdateValidItems(id, validItems);
    }
}
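Note that a single lock object serializes all stores, not just the one being updated. If the per-store behaviour from the question matters, a lock object per store key is a common variant; a small sketch of my own (id.Key and validItems are placeholders for the asker's actual members):
private readonly ConcurrentDictionary<string, object> _storeLocks = new ConcurrentDictionary<string, object>();

public void ProcessItems(Store id, Items items)
{
    // Some logic that produces validItems...
    object storeLock = _storeLocks.GetOrAdd(id.Key, _ => new object());
    lock (storeLock) // only callers for the same store wait here
    {
        UpdateValidItems(id, validItems);
    }
}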

Related

UWP Update UI From Async Worker

I am trying to implement a long-running background process, that periodically reports on its progress, to update the UI in a UWP app. How can I accomplish this? I have seen several helpful topics, but none have all of the pieces, and I have been unable to put them all together.
For example, consider a user who picks a very large file, and the app is reading in and/or operating on the data in the file. The user clicks a button, which populates a list stored on the page with data from the file the user picks.
PART 1
The page and button's click event handler look something like this:
public sealed partial class MyPage : Page
{
public List<DataRecord> DataRecords { get; set; }
private DateTime LastUpdate;
public MyPage()
{
this.InitializeComponent();
this.DataRecords = new List<DataRecord>();
this.LastUpdate = DateTime.Now;
// Subscribe to the event handler for updates.
MyStorageWrapper.MyEvent += this.UpdateUI;
}
private async void LoadButton_Click(object sender, RoutedEventArgs e)
{
StorageFile pickedFile = // … obtained from FileOpenPicker.
if (pickedFile != null)
{
this.DataRecords = await MyStorageWrapper.GetDataAsync(pickedFile);
}
}
private void UpdateUI(long lineCount)
{
// This time check prevents the UI from updating so frequently
// that it becomes unresponsive as a result.
DateTime now = DateTime.Now;
if ((now - this.LastUpdate).TotalMilliseconds > 3000)
{
// This updates a textblock to display the count, but could also
// update a progress bar or progress ring in here.
this.MessageTextBlock.Text = "Count: " + lineCount;
this.LastUpdate = now;
}
}
}
Inside of the MyStorageWrapper class:
public static class MyStorageWrapper
{
public delegate void MyEventHandler(long lineCount);
public static event MyEventHandler MyEvent;
private static void RaiseMyEvent(long lineCount)
{
// Ensure that something is listening to the event.
if (MyStorageWrapper.MyEvent != null)
{
// Call the listening event handlers.
MyStorageWrapper.MyEvent(lineCount);
}
}
public static async Task<List<DataRecord>> GetDataAsync(StorageFile file)
{
List<DataRecord> recordsList = new List<DataRecord>();
using (Stream stream = await file.OpenStreamForReadAsync())
{
using (StreamReader reader = new StreamReader(stream))
{
while (!reader.EndOfStream)
{
string line = reader.ReadLine();
// Does its parsing here, and constructs a single DataRecord …
recordsList.Add(dataRecord);
// Raises an event.
MyStorageWrapper.RaiseMyEvent(recordsList.Count);
}
}
}
return recordsList;
}
}
The code for the time check I got from following this.
As written, this code makes the app unresponsive with a large file (I tested on a text file on the order of about 8.5 million lines). I thought adding async and await to the GetDataAsync() call would prevent this? Does this not do its work on a thread aside from the UI thread? Through Debug mode in Visual Studio, I have verified the program is progressing as expected... it is just tying up the UI thread, making the app unresponsive (see this page from Microsoft about the UI thread and asynchronous programming).
PART 2
I have successfully implemented before an asynchronous, long-running process that runs on a separate thread AND still updates the UI periodically... but this solution does not allow for the return value - specifically the line from PART 1 that says:
this.DataRecords = await MyStorageWrapper.GetDataAsync(pickedFile);
My previous, successful implementation follows (most of the bodies cut out for brevity). Is there a way to adapt this to allow for return values?
In a Page class:
public sealed partial class MyPage : Page
{
public Generator MyGenerator { get; set; }
public MyPage()
{
this.InitializeComponent();
this.MyGenerator = new Generator();
}
private void StartButton_Click(object sender, RoutedEventArgs e)
{
this.MyGenerator.ProgressUpdate += async (s, f) => await Dispatcher.RunAsync(Windows.UI.Core.CoreDispatcherPriority.Normal, delegate ()
{
// Updates UI elements on the page from here.
});
this.MyGenerator.Start();
}
private void StopButton_Click(object sender, RoutedEventArgs e)
{
this.MyGenerator.Stop();
}
}
And in the Generator class:
public class Generator
{
private CancellationTokenSource cancellationTokenSource;
public event EventHandler<GeneratorStatus> ProgressUpdate;
public Generator()
{
this.cancellationTokenSource = new CancellationTokenSource();
}
public void Start()
{
Task task = Task.Run(() =>
{
while(true)
{
// Throw an Operation Cancelled exception if the task is cancelled.
this.cancellationTokenSource.Token.ThrowIfCancellationRequested();
// Does stuff here.
// Finally raise the event (assume that 'args' is the correct args and datatypes).
this.ProgressUpdate.Raise(this, new GeneratorStatus(args));
}
}, this.cancellationTokenSource.Token);
}
public void Stop()
{
this.cancellationTokenSource.Cancel();
}
}
Finally, there are two supporting classes for the ProgressUpdate event:
public class GeneratorStatus : EventArgs
{
// This class can contain a handful of properties; only one shown.
public int number { get; private set; }
public GeneratorStatus(int n)
{
this.number = n;
}
}
static class EventExtensions
{
public static void Raise(this EventHandler<GeneratorStatus> theEvent, object sender, GeneratorStatus args)
{
theEvent?.Invoke(sender, args);
}
}
It is key to understand that async/await does not, by itself, mean the awaited code will run on a different thread. When you do await GetDataAsync(pickedFile); the execution enters the GetDataAsync method still on the UI thread and continues there until await file.OpenStreamForReadAsync() is reached - and this is the only operation that will actually run asynchronously on a different thread (as file.OpenStreamForReadAsync is actually implemented that way).
However, once OpenStreamForReadAsync has completed (which will be really quick), await makes sure the execution returns to the same thread it started on - which means the UI thread. So the actual expensive part of your code (reading the file in the while loop) runs on the UI thread.
You could marginally improve this by using reader.ReadLineAsync, but still, you will be returning to UI thread after each await.
ConfigureAwait(false)
The first trick you want to introduce to resolve this problem is ConfigureAwait(false).
Calling this on an asynchronous call tells the runtime that execution does not have to return to the thread that originally called the asynchronous method - hence it can avoid returning execution to the UI thread. A great place to put it in your case is the OpenStreamForReadAsync and ReadLineAsync calls:
public static async Task<List<DataRecord>> GetDataAsync(StorageFile file)
{
List<DataRecord> recordsList = new List<DataRecord>();
using (Stream stream = await file.OpenStreamForReadAsync().ConfigureAwait(false))
{
using (StreamReader reader = new StreamReader(stream))
{
while (!reader.EndOfStream)
{
string line = await reader.ReadLineAsync().ConfigureAwait(false);
// Does its parsing here, and constructs a single DataRecord …
recordsList.Add(dataRecord);
// Raises an event.
MyStorageWrapper.RaiseMyEvent(recordsList.Count);
}
}
}
return recordsList;
}
Dispatcher
Now you have freed up your UI thread, but introduced another problem with the progress reporting. Because MyStorageWrapper.RaiseMyEvent(recordsList.Count) now runs on a different thread, you cannot update the UI in the UpdateUI method directly, as accessing UI elements from a non-UI thread throws a synchronization exception. Instead, you must use the UI thread's Dispatcher to make sure the code runs on the right thread.
In the constructor get reference to the UI thread Dispatcher:
private CoreDispatcher _dispatcher;
public MyPage()
{
this.InitializeComponent();
_dispatcher = Window.Current.Dispatcher;
...
}
The reason to do this ahead of time is that Window.Current is, again, accessible only from the UI thread; the page constructor definitely runs there, so it is the ideal place to capture the dispatcher.
Now rewrite UpdateUI as follows
private async void UpdateUI(long lineCount)
{
await _dispatcher.RunAsync(CoreDispatcherPriority.Normal, () =>
{
// This time check prevents the UI from updating so frequently
// that it becomes unresponsive as a result.
DateTime now = DateTime.Now;
if ((now - this.LastUpdate).TotalMilliseconds > 3000)
{
// This updates a textblock to display the count, but could also
// update a progress bar or progress ring in here.
this.MessageTextBlock.Text = "Count: " + lineCount;
this.LastUpdate = now;
}
});
}
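As a side note (not part of the original answer), the same progress reporting can also be done with IProgress&lt;T&gt;/Progress&lt;T&gt;: a Progress&lt;T&gt; constructed on the UI thread captures the UI SynchronizationContext, so its callback is marshalled back for you and no explicit dispatcher code is needed. A rough sketch, with the static event replaced by a progress parameter:
public static async Task<List<DataRecord>> GetDataAsync(StorageFile file, IProgress<long> progress)
{
    List<DataRecord> recordsList = new List<DataRecord>();
    using (Stream stream = await file.OpenStreamForReadAsync().ConfigureAwait(false))
    using (StreamReader reader = new StreamReader(stream))
    {
        while (!reader.EndOfStream)
        {
            string line = await reader.ReadLineAsync().ConfigureAwait(false);
            // ... parse 'line' and add the resulting DataRecord to recordsList, as before ...
            progress?.Report(recordsList.Count); // Progress<T> dispatches this back to the UI thread
        }
    }
    return recordsList;
}

// On the page (created on the UI thread, e.g. in LoadButton_Click):
// var progress = new Progress<long>(count => this.UpdateUI(count));
// this.DataRecords = await MyStorageWrapper.GetDataAsync(pickedFile, progress);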

Producer/ Consumer pattern using threads and EventWaitHandle

I guess it is sort of a code review, but here is my implementation of the producer / consumer pattern. What I would like to know is would there be a case in which the while loops in the ReceivingThread() or SendingThread() methods might stop executing. Please note that EnqueueSend(DataSendEnqeueInfo info) is called from multiple different threads and I probably can't use tasks here since I definitely have to consume commands in a separate thread.
private Thread mReceivingThread;
private Thread mSendingThread;
private Queue<DataRecievedEnqeueInfo> mReceivingThreadQueue;
private Queue<DataSendEnqeueInfo> mSendingThreadQueue;
private readonly object mReceivingQueueLock = new object();
private readonly object mSendingQueueLock = new object();
private bool mIsRunning;
EventWaitHandle mRcWaitHandle;
EventWaitHandle mSeWaitHandle;
private void ReceivingThread()
{
while (mIsRunning)
{
mRcWaitHandle.WaitOne();
DataRecievedEnqeueInfo item = null;
while (mReceivingThreadQueue.Count > 0)
{
lock (mReceivingQueueLock)
{
item = mReceivingThreadQueue.Dequeue();
}
ProcessReceivingItem(item);
}
mRcWaitHandle.Reset();
}
}
private void SendingThread()
{
while (mIsRunning)
{
mSeWaitHandle.WaitOne();
while (mSendingThreadQueue.Count > 0)
{
DataSendEnqeueInfo item = null;
lock (mSendingQueueLock)
{
item = mSendingThreadQueue.Dequeue();
}
ProcessSendingItem(item);
}
mSeWaitHandle.Reset();
}
}
internal void EnqueueRecevingData(DataRecievedEnqeueInfo info)
{
lock (mReceivingQueueLock)
{
mReceivingThreadQueue.Enqueue(info);
mRcWaitHandle.Set();
}
}
public void EnqueueSend(DataSendEnqeueInfo info)
{
lock (mSendingQueueLock)
{
mSendingThreadQueue.Enqueue(info);
mSeWaitHandle.Set();
}
}
P.S. The idea here is that I am using WaitHandles to put the threads to sleep when the queues are empty, and to signal them to start when new items are enqueued.
UPDATE
I am just going to leave this here for people who might be trying to implement the producer/consumer pattern with TPL or tasks: https://blogs.msdn.microsoft.com/benwilli/2015/09/10/tasks-are-still-not-threads-and-async-is-not-parallel/
Use a BlockingCollection instead of Queue, EventWaitHandle and lock objects:
public class DataInfo { }
private Thread mReceivingThread;
private Thread mSendingThread;
private BlockingCollection<DataInfo> queue = new BlockingCollection<DataInfo>();
private CancellationTokenSource receivingCts = new CancellationTokenSource();
private void ReceivingThread()
{
try
{
while (!receivingCts.IsCancellationRequested)
{
// This will block until an item is added to the queue or the cancellation token is cancelled
DataInfo item = queue.Take(receivingCts.Token);
ProcessReceivingItem(item);
}
}
catch (OperationCanceledException)
{
}
}
internal void EnqueueRecevingData(DataInfo info)
{
// When a new item is produced, just add it to the queue
queue.Add(info);
}
// To cancel the receiving thread, cancel the token
private void CancelReceivingThread()
{
receivingCts.Cancel();
}
Personally, for simple producer-consumer problems, I would just use BlockingCollection. There would be no need to manually code your own synchronization logic. The consuming threads will also block if there are no items present in the queue.
Here is what your code might look like if you use this class:
private BlockingCollection<DataRecievedEnqeueInfo> mReceivingThreadQueue = new BlockingCollection<DataRecievedEnqeueInfo>();
private BlockingCollection<DataSendEnqeueInfo> mSendingThreadQueue = new BlockingCollection<DataSendEnqeueInfo>();
public void Stop()
{
// No need for mIsRunning. CompleteAdding() makes the enumerables returned by the
// GetConsumingEnumerable() calls below finish, which ends the consuming loops.
mReceivingThreadQueue.CompleteAdding();
mSendingThreadQueue.CompleteAdding();
}
private void ReceivingThread()
{
foreach (DataRecievedEnqeueInfo item in mReceivingThreadQueue.GetConsumingEnumerable())
{
ProcessReceivingItem(item);
}
}
private void SendingThread()
{
foreach (DataSendEnqeueInfo item in mSendingThreadQueue.GetConsumingEnumerable())
{
ProcessSendingItem(item);
}
}
internal void EnqueueRecevingData(DataRecievedEnqeueInfo info)
{
// You can also use TryAdd() if there is a possibility that you
// can add items after you have stopped. Otherwise, this can throw an
// an exception after CompleteAdding() has been called.
mReceivingThreadQueue.Add(info);
}
public void EnqueueSend(DataSendEnqeueInfo info)
{
mSendingThreadQueue.Add(info);
}
As suggested in the comments, you can also give the TPL Dataflow blocks a try.
As far as I can see, you have two similar pipelines, for receive and send, so I assume that your class hierarchy is like this:
class EnqueueInfo { }
class DataRecievedEnqeueInfo : EnqueueInfo { }
class DataSendEnqeueInfo : EnqueueInfo { }
We can assemble an abstract class which will encapsulate the logic for creating the pipeline and provide an interface for processing the items, like this:
abstract class EnqueueInfoProcessor<T>
where T : EnqueueInfo
{
// here we will store all the messages received before the handling
private readonly BufferBlock<T> _buffer;
// simple action block for actual handling the items
private ActionBlock<T> _action;
// cancellation token to cancel the pipeline
public EnqueueInfoProcessor(CancellationToken token)
{
_buffer = new BufferBlock<T>(new DataflowBlockOptions { CancellationToken = token });
_action = new ActionBlock<T>(item => ProcessItem(item), new ExecutionDataflowBlockOptions
{
MaxDegreeOfParallelism = Environment.ProcessorCount,
CancellationToken = token
});
// we are linking two blocks so all the items from buffer
// will flow down to action block in order they've been received
_buffer.LinkTo(_action, new DataflowLinkOptions { PropagateCompletion = true });
}
public void PostItem(T item)
{
// post synchronously; Post returns immediately (false if the block declined the item)
_buffer.Post(item);
}
public async Task SendItemAsync(T item)
{
// asynchronously wait for message to be posted
await _buffer.SendAsync(item);
}
// abstract method to implement
protected abstract void ProcessItem(T item);
}
Note that you can also encapsulate the link between two blocks by using the Encapsulate<TInput, TOutput> method, but in that case you have to properly handle the Completion of the buffer block, if you're using it.
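For illustration (my own sketch, not the answer's code): Encapsulate wraps an already linked pair of blocks into a single propagator. Complete() is forwarded to the inner target block, and completion propagates onward only because of the PropagateCompletion link option - which is the detail the warning above refers to:
var buffer = new BufferBlock<int>();
var transform = new TransformBlock<int, string>(i => i.ToString());
buffer.LinkTo(transform, new DataflowLinkOptions { PropagateCompletion = true });

// Expose the pair as one block: Post/Complete go to 'buffer', output comes from 'transform'.
IPropagatorBlock<int, string> pipeline = DataflowBlock.Encapsulate(buffer, transform);

pipeline.Post(42);
pipeline.Complete();          // completes the buffer...
pipeline.Completion.Wait();   // ...and, thanks to PropagateCompletion, the transform as well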
After this, we just need to implement two methods for receive and send handle logic:
public class SendEnqueueInfoProcessor : EnqueueInfoProcessor<DataSendEnqeueInfo>
{
public SendEnqueueInfoProcessor(CancellationToken token)
: base(token)
{
}
protected override void ProcessItem(DataSendEnqeueInfo item)
{
// send logic here
}
}
public class RecievedEnqueueInfoProcessor : EnqueueInfoProcessor<DataRecievedEnqeueInfo>
{
public RecievedEnqueueInfoProcessor(CancellationToken token)
: base(token)
{
}
protected override void ProcessItem(DataRecievedEnqeueInfo item)
{
// recieve logic here
}
}
You can also create a more complicated pipeline with a TransformBlock<DataRecievedEnqeueInfo, DataSendEnqeueInfo>, if your message flow is such that a received message becomes a message to send.
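A rough sketch of that variant (again my own, reusing the type names from above): a received message is transformed into a message to send, and the result flows into a sending ActionBlock:
var received = new BufferBlock<DataRecievedEnqeueInfo>();
var toSend = new TransformBlock<DataRecievedEnqeueInfo, DataSendEnqeueInfo>(
    msg => new DataSendEnqeueInfo() /* build the outgoing message from 'msg' here */);
var send = new ActionBlock<DataSendEnqeueInfo>(info => { /* send logic here */ });

received.LinkTo(toSend, new DataflowLinkOptions { PropagateCompletion = true });
toSend.LinkTo(send, new DataflowLinkOptions { PropagateCompletion = true });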

Simple in-memory message queue

Our existing implementation of domain events limits (by blocking) publishing to one thread at a time to avoid reentrant calls to handlers:
public interface IDomainEvent {} // Marker interface
public class Dispatcher : IDisposable
{
private readonly SemaphoreSlim semaphore = new SemaphoreSlim(1, 1);
// Subscribe code...
public void Publish(IDomainEvent domainEvent)
{
semaphore.Wait();
try
{
// Get event subscriber(s) from concurrent dictionary...
foreach (Action<IDomainEvent> subscriber in eventSubscribers)
{
subscriber(domainEvent);
}
}
finally
{
semaphore.Release();
}
}
// Dispose pattern...
}
If a handler publishes an event, this will deadlock.
How can I rewrite this to serialize calls to Publish? In other words, if subscribing handler A publishes event B, I'll get:
Handler A called
Handler B called
while preserving the condition of no reentrant calls to handlers in a multithreaded environment.
I do not want to change the public method signature; there's no place in the application to call a method to publish a queue, for instance.
We came up with a way to do it synchronously.
public class Dispatcher : IDisposable
{
private readonly ConcurrentQueue<IDomainEvent> queue = new ConcurrentQueue<IDomainEvent>();
private readonly SemaphoreSlim semaphore = new SemaphoreSlim(1, 1);
// Subscribe code...
public void Publish(IDomainEvent domainEvent)
{
queue.Enqueue(domainEvent);
if (IsPublishing)
{
return;
}
PublishQueue();
}
private void PublishQueue()
{
IDomainEvent domainEvent;
while (queue.TryDequeue(out domainEvent))
{
InternalPublish(domainEvent);
}
}
private void InternalPublish(IDomainEvent domainEvent)
{
semaphore.Wait();
try
{
// Get event subscriber(s) from concurrent dictionary...
foreach (Action<IDomainEvent> subscriber in eventSubscribers)
{
subscriber(domainEvent);
}
}
finally
{
semaphore.Release();
}
// Necessary, as calls to Publish during publishing could have queued events and returned.
PublishQueue();
}
private bool IsPublishing
{
get { return semaphore.CurrentCount < 1; }
}
// Dispose pattern for semaphore...
}
You will have to make Publish asynchronous to achieve that. A naive implementation would be as simple as:
public class Dispatcher : IDisposable {
private readonly BlockingCollection<IDomainEvent> _queue = new BlockingCollection<IDomainEvent>(new ConcurrentQueue<IDomainEvent>());
private readonly CancellationTokenSource _cts = new CancellationTokenSource();
public Dispatcher() {
new Thread(Consume) {
IsBackground = true
}.Start();
}
private List<Action<IDomainEvent>> _subscribers = new List<Action<IDomainEvent>>();
public void AddSubscriber(Action<IDomainEvent> sub) {
_subscribers.Add(sub);
}
private void Consume() {
try {
foreach (var @event in _queue.GetConsumingEnumerable(_cts.Token)) {
try {
foreach (Action<IDomainEvent> subscriber in _subscribers) {
subscriber(@event);
}
}
catch (Exception ex) {
// log, handle
}
}
}
catch (OperationCanceledException) {
// expected
}
}
public void Publish(IDomainEvent domainEvent) {
_queue.Add(domainEvent);
}
public void Dispose() {
_cts.Cancel();
}
}
It can't be done with that interface. You can process the event subscriptions asynchronously to remove the deadlock while still running them serially, but then you can't guarantee the order you described. Another call to Publish might enqueue something (event C) while the handler for event A is running but before it publishes event B. Then event B ends up behind event C in the queue.
As long as Handler A is on equal footing with other clients when it comes to getting an item in the queue, it either has to wait like everyone else (deadlock) or it has to play fairly (first come, first served). The interface you have there doesn't allow the two to be treated differently.
That's not to say you couldn't get up to some shenanigans in your logic to attempt to differentiate them (e.g. based on thread id or something else identifiable), but anything along those lines would be unreliable if you don't control the subscriber code as well.

WinForm asynchronously update UI status from console application call

I want to asynchronously update UI status while doing a long-running task. The program is a console application; however, when I start the async operation, the UI thread exits soon after the task begins.
How can I make the UI thread wait until my long-running task finishes?
I have simplified my code as below:
public static class Program
{
static void Main()
{
WorkerWrapper wp = new WorkerWrapper();
wp.ProcessData();
}
}
public class WorkerWrapper
{
private RateBar bar;
public void ProcessData()
{
bar = new RateBar();
bar.Show();
Worker wk = new Worker();
wk.WorkProcess += wk_WorkProcess;
Action handler = new Action(wk.DoWork);
var result = handler.BeginInvoke(new AsyncCallback(this.AsyncCallback), handler);
}
private void AsyncCallback(IAsyncResult ar)
{
Action handler = ar.AsyncState as Action;
handler.EndInvoke(ar);
}
private void wk_WorkProcess(object sender, PrecentArgs e)
{
if (e.Precent < 100)
{
bar.Precent = e.Precent;
}
}
}
public class Worker
{
public event EventHandler<PrecentArgs> WorkProcess;
public void DoWork()
{
for (int i = 0; i < 100; i++)
{
WorkProcess(this, new PrecentArgs(i));
Thread.Sleep(100);
}
}
}
public class PrecentArgs : EventArgs
{
public int Precent { get; set; }
public PrecentArgs(int precent)
{
Precent = precent;
}
}
public partial class RateBar : Form
{
public int Precent
{
set
{
System.Windows.Forms.MethodInvoker invoker = () => this.progressBar1.Value = value;
if (this.progressBar1.InvokeRequired)
{
this.progressBar1.Invoke(invoker);
}
else
{
invoker();
}
}
}
public RateBar()
{
InitializeComponent();
}
}
However, in ProcessData(), if I add result.AsyncWaitHandle.WaitOne() at the end to wait for my operation to complete, the Form freezes.
Is there anything wrong with the way I wait for the operation to complete?
The reason your application exits before your "background threads" complete is that a process exits as soon as no foreground threads are left running. This is explained in more detail here: http://msdn.microsoft.com/en-us/library/system.threading.thread.isbackground(v=vs.110).aspx
You should add a proper waiting mechanism so your background work can complete. There are multiple ways of letting other threads know that a thread has finished; please refer to How to wait for thread to finish with .NET?
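For example, a minimal way to keep the console process alive would be something like the following sketch of my own (it assumes a ManualResetEventSlim field named done that the async callback sets when the work is finished):
internal static readonly ManualResetEventSlim done = new ManualResetEventSlim(false);

static void Main()
{
    WorkerWrapper wp = new WorkerWrapper();
    wp.ProcessData();
    done.Wait(); // blocks the main (foreground) thread until the worker signals completion
}

// ...and at the end of WorkerWrapper.AsyncCallback, after EndInvoke: Program.done.Set();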
You shouldn't block the UI thread waiting for the result, but rather retrieve the result from EndInvoke. Your deadlock probably occurs because you are using both result.AsyncWaitHandle.WaitOne() and EndInvoke, both of which block until the result is available.
In my opinion the best option is to not call result.AsyncWaitHandle.WaitOne() and just retrieve the result in the AsyncCallback
private void AsyncCallback(IAsyncResult ar)
{
Action handler = ar.AsyncState as Action;
handler.EndInvoke(ar);
}
More information here. Also, if you are using .NET 4.5 or higher (or 4.0 with the Microsoft.Bcl.Async package), this sort of thing can be done much more easily with async/await.
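For what it's worth, a rough async/await version of ProcessData (my own sketch, assuming the form runs with a normal message loop, e.g. via Application.Run) could look like this:
public async Task ProcessDataAsync()
{
    bar = new RateBar();
    bar.Show();
    Worker wk = new Worker();
    wk.WorkProcess += wk_WorkProcess;   // RateBar.Precent already marshals via Invoke
    await Task.Run(() => wk.DoWork());  // the loop runs on a pool thread, not the UI thread
    // execution resumes here once DoWork has finished
}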
I am writing down this solution in the hope that it may help others with the same question.
The key to this problem is to use a new thread to run the RateBar's ShowDialog function.
public void ProcessData()
{
new Thread(() => { bar = new RateBar(); bar.ShowDialog(); }).Start(); // give the dialog its own message loop
Worker wk = new Worker();
wk.WorkProcess += wk_WorkProcess;
Action handler = new Action(wk.DoWork);
var result = handler.BeginInvoke(new AsyncCallback(this.AsyncCallback), handler);
}

Create And Execute Tasks in A Row

I am developing an application in which I want to execute tasks from a single place, so that every time I add a new task it is put into that queue to be executed. I also want a priority for each task: if I set the task's priority to HIGH it is added to the front of the queue so it is executed immediately; on the other hand, if I set the priority to LOW it is added to the end of the queue, and so on.
I thought about using Tasks and ContinueWith, but I don't have any clue where to start to build a class that handles this.
I am sorry for not providing any code, but I hope someone gets the point I am making and can help me. Thank you in advance.
Well, if you didn't need to make room for high-priority tasks, you could make a simple helper class using Task and ContinueWith:
public class SimpleWorkQueue
{
private Task _main = null;
public void AddTask(Action task)
{
if (_main == null)
{
_main = new Task(task);
_main.Start();
}
else
{
Action<Task> next = (t) => task();
_main = _main.ContinueWith(next);
}
}
}
If you do need high-priority tasks, you probably need to handle more stuff yourself. Here is a producer/consumer example where all incoming tasks are inserted into a list in AddTask(), and a single worker thread consumes tasks from that list:
public class PrioritizedWorkQueue
{
List<Action> _queuedWork;
object _queueLocker;
Thread _workerThread;
public PrioritizedWorkQueue()
{
_queueLocker = new object();
_queuedWork = new List<Action>();
_workerThread = new Thread(LookForWork);
_workerThread.IsBackground = true;
_workerThread.Start();
}
private void LookForWork()
{
while (true)
{
Action work;
lock (_queueLocker)
{
while (!_queuedWork.Any()) { Monitor.Wait(_queueLocker); }
work = _queuedWork.First();
_queuedWork.RemoveAt(0);
}
work();
}
}
public void AddTask(Action task, bool highPriority)
{
lock (_queueLocker)
{
if (highPriority)
{
_queuedWork.Insert(0, task);
}
else
{
_queuedWork.Add(task);
}
Monitor.Pulse(_queueLocker);
}
}
}
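Usage might look like this (a small illustration of my own):
var queue = new PrioritizedWorkQueue();
queue.AddTask(() => Console.WriteLine("normal work"), highPriority: false);
queue.AddTask(() => Console.WriteLine("urgent work"), highPriority: true); // jumps to the front of the list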
