How to make a "fire & forget" async FIFO queue in c#? - c#

I'm trying to process documents asynchronously. The idea is that the user sends documents to a service, which takes time (about 20-90 seconds per document), and looks at the results later.
Ideally, I would like to just fill some kind of observable collection that would be emptied by the system as fast as it can. When there is an item, process it and produce the expected output in another object, and when there is no item just do nothing. When the user checks the output collection, he will find the items that are already processed.
Ideally all items would be visible from the start and would have a state (completed, ongoing or in queue), but once I know how to do the first, I should be able to handle the states.
I'm not sure which object to use for this; right now I'm looking at BlockingCollection, but I don't think it's suited for the job, as I can't fill it while it's being emptied from the other end.
private BlockingCollection<IDocument> _jobs = new BlockingCollection<IDocument>();
public ObservableCollection<IExtractedDocument> ExtractedDocuments { get; }
public QueueService()
{
ExtractedDocuments = new ObservableCollection<IExtractedDocument>();
}
public async Task Add(string filePath, List<Extra> extras)
{
if (_jobs.IsAddingCompleted || _jobs.IsCompleted)
_jobs = new BlockingCollection<IDocument>();
var doc = new Document(filePath, extras);
_jobs.Add(doc);
_jobs.CompleteAdding();
await ProcessQueue();
}
private async Task ProcessQueue()
{
foreach (var document in _jobs.GetConsumingEnumerable(CancellationToken.None))
{
var resultDocument = await service.ProcessDocument(document);
ExtractedDocuments.Add(resultDocument );
Debug.WriteLine("Job completed");
}
}
This is how I'm handling it right now. If I remove the CompleteAdding call, it hangs on the second attempt. If I keep that statement, then I can't just fill the queue; I have to empty it first, which defeats the purpose.
Is there a way of having what I'm trying to achieve? A collection that I would fill and the system would process asynchronously and autonomously?
To summarize, I need:
A collection that I can fill, that would be processed gradually and asynchronously. A document or series of documents can be added while some are being processed.
An output collection that would be filled after the process is complete
The UI thread and app to still be responsive while everything is running
It doesn't matter whether documents are processed in parallel or one at a time; whichever is easiest to put in place and maintain will do (small-scale application). I'm assuming one at a time is simpler.
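For reference, a minimal sketch of this fire & forget shape built on System.Threading.Channels (available as a NuGet package, built into newer .NET). This is only an illustration, not the accepted approach below; Document, Extra, IDocument, IExtractedDocument and the service field are assumed to be the question's own members.
using System.Collections.Generic;
using System.Collections.ObjectModel;
using System.Threading.Channels;
using System.Threading.Tasks;
public class ChannelQueueService
{
    // Unbounded FIFO channel: producers can keep writing while the consumer drains it.
    private readonly Channel<IDocument> _jobs = Channel.CreateUnbounded<IDocument>();
    public ObservableCollection<IExtractedDocument> ExtractedDocuments { get; } = new ObservableCollection<IExtractedDocument>();

    public ChannelQueueService()
    {
        // Single long-running consumer: processes one document at a time, idles when the channel is empty.
        _ = Task.Run(ProcessQueueAsync);
    }

    public void Add(string filePath, List<Extra> extras) =>
        _jobs.Writer.TryWrite(new Document(filePath, extras)); // fire & forget, no CompleteAdding needed

    private async Task ProcessQueueAsync()
    {
        await foreach (var document in _jobs.Reader.ReadAllAsync())
        {
            var result = await service.ProcessDocument(document); // 'service' as in the question
            ExtractedDocuments.Add(result); // marshal to the UI thread if this collection is data-bound
        }
    }
}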

A common pattern here is to have a callback method that executes upon a document state change. With a background task running, it will chew through documents as fast as it can. Call Dispose to shut down the processor.
If you need to process the callback on a GUI thread, you'll need to synchronize the callback to your main thread somehow (see the short sketch after the example below). Windows Forms has methods to do this if that's what you are using.
This example program implements all the necessary classes and interfaces, and you can fine tune and tweak things as you need.
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;
namespace ConsoleApp2
{
class Program
{
private static Task Callback(IExtractedDocument doc, DocumentProcessor.DocState docState)
{
Console.WriteLine("Processing doc {0}, state: {1}", doc, docState);
return Task.CompletedTask;
}
public static void Main()
{
using DocumentProcessor docProcessor = new DocumentProcessor(Callback);
Console.WriteLine("Processor started, press any key to end processing");
for (int i = 0; i < 100; i++)
{
if (Console.KeyAvailable)
{
break;
}
else if (i == 5)
{
// make an error
docProcessor.Add(null);
}
else
{
docProcessor.Add(new Document { Text = "Test text " + Guid.NewGuid().ToString() });
}
Thread.Sleep(500);
}
Console.WriteLine("Doc processor shut down, press ENTER to quit");
Console.ReadLine();
}
public interface IDocument
{
public string Text { get; }
}
public class Document : IDocument
{
public string Text { get; set; }
}
public interface IExtractedDocument : IDocument
{
public IDocument OriginalDocument { get; }
public Exception Error { get; }
}
public class ExtractedDocument : IExtractedDocument
{
public override string ToString()
{
return $"Orig text: {OriginalDocument?.Text}, Extracted Text: {Text}, Error: {Error}";
}
public IDocument OriginalDocument { get; set; }
public string Text { get; set; }
public Exception Error { get; set; }
}
public class DocumentProcessor : IDisposable
{
public enum DocState { Processing, Completed, Error }
private readonly BlockingCollection<IDocument> queue = new BlockingCollection<IDocument>();
private readonly Func<IExtractedDocument, DocState, Task> callback;
private CancellationTokenSource cancelToken = new CancellationTokenSource();
public DocumentProcessor(Func<IExtractedDocument, DocState, Task> callback)
{
this.callback = callback;
Task.Run(() => StartQueueProcessor()).GetAwaiter();
}
public void Dispose()
{
if (!cancelToken.IsCancellationRequested)
{
cancelToken.Cancel();
}
}
public void Add(IDocument doc)
{
if (cancelToken.IsCancellationRequested)
{
throw new InvalidOperationException("Processor is disposed");
}
queue.Add(doc);
}
private void ProcessDocument(IDocument doc)
{
try
{
// do processing
DoCallback(new ExtractedDocument { OriginalDocument = doc }, DocState.Processing);
if (doc is null)
{
throw new ArgumentNullException("Document to process was null");
}
IExtractedDocument successExtractedDocument = DoSomeDocumentProcessing(doc);
DoCallback(successExtractedDocument, DocState.Completed);
}
catch (Exception ex)
{
DoCallback(new ExtractedDocument { OriginalDocument = doc, Error = ex }, DocState.Error);
}
}
private IExtractedDocument DoSomeDocumentProcessing(IDocument originalDocument)
{
return new ExtractedDocument { OriginalDocument = originalDocument, Text = "Extracted: " + originalDocument.Text };
}
private void DoCallback(IExtractedDocument result, DocState docState)
{
if (callback != null)
{
// send callbacks in background
callback(result, docState).GetAwaiter();
}
}
private void StartQueueProcessor()
{
try
{
while (!cancelToken.Token.IsCancellationRequested)
{
if (queue.TryTake(out IDocument doc, 1000, cancelToken.Token))
{
// can change to Task.Run(() => ProcessDocument(doc)).GetAwaiter() for parallel execution
ProcessDocument(doc);
}
}
}
catch (OperationCanceledException)
{
// ignore, don't need to throw or worry about this
}
while (queue.TryTake(out IDocument doc))
{
DoCallback(new ExtractedDocument { OriginalDocument = doc, Error = new ObjectDisposedException("Processor was disposed") }, DocState.Error);
}
}
}
}
}
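As noted above about synchronizing the callback to the main thread, here is a minimal sketch (an assumption, not part of the example) of marshalling the callback through a SynchronizationContext captured on the UI thread, reusing the ExtractedDocuments collection from the question:
// Capture the context (System.Threading) in the constructor of the class that owns
// the UI-bound collection, while running on the UI thread (WinForms or WPF).
private readonly SynchronizationContext uiContext = SynchronizationContext.Current;

private Task Callback(IExtractedDocument doc, DocumentProcessor.DocState docState)
{
    // Post the update so the ObservableCollection is only touched on the UI thread.
    uiContext.Post(_ => ExtractedDocuments.Add(doc), null);
    return Task.CompletedTask;
}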

Related

C# Design pattern for periodic execution of multiple Threads

I have the below requirement in my C# Windows Service.
At service startup, it fetches a collection of data from the db and keeps it in memory.
Business logic has to be executed periodically from 3 different threads.
Each thread will execute the same business logic with a different subset of data from the data collection mentioned in step 1. Each thread will produce a different result set.
All 3 threads will run periodically whenever any change happens to the data collection.
When any client makes a call to the service, the service should be able to return the status of the thread execution.
I know C# has different mechanisms to implement periodic thread execution.
Timers, threads with Sleep, EventWaitHandle, etc.
I am trying to understand which threading mechanism or design pattern will best fit this requirement?
A more modern approach would be to use tasks, but have a look at the principles:
namespace Test {
public class Program {
public static void Main() {
System.Threading.Thread main = new System.Threading.Thread(() => new Processor().Startup());
main.IsBackground = false;
main.Start();
System.Console.ReadKey();
}
}
public class ProcessResult { /* add your result state */ }
public class ProcessState {
public ProcessResult ProcessResult1 { get; set; }
public ProcessResult ProcessResult2 { get; set; }
public ProcessResult ProcessResult3 { get; set; }
public string State { get; set; }
}
public class Processor {
private readonly object _Lock = new object();
private readonly DataFetcher _DataFetcher;
private ProcessState _ProcessState;
public Processor() {
_DataFetcher = new DataFetcher();
_ProcessState = null;
}
public void Startup() {
_DataFetcher.DataChanged += DataFetcher_DataChanged;
}
private void DataFetcher_DataChanged(object sender, DataEventArgs args) => StartProcessingThreads(args.Data);
private void StartProcessingThreads(string data) {
lock (_Lock) {
_ProcessState = new ProcessState() { State = "Starting", ProcessResult1 = null, ProcessResult2 = null, ProcessResult3 = null };
System.Threading.Thread one = new System.Threading.Thread(() => DoProcess1(data)); // manipulate the data to a subset
one.IsBackground = true;
one.Start();
System.Threading.Thread two = new System.Threading.Thread(() => DoProcess2(data)); // manipulate the data to a subset
two.IsBackground = true;
two.Start();
System.Threading.Thread three = new System.Threading.Thread(() => DoProcess3(data)); // manipulate the data to a subset
three.IsBackground = true;
three.Start();
}
}
public ProcessState GetState() => _ProcessState;
private void DoProcess1(string dataSubset) {
// do work
ProcessResult result = new ProcessResult(); // this object contains the result
// on completion
lock (_Lock) {
_ProcessState = new ProcessState() { State = (_ProcessState.State ?? string.Empty) + ", 1 done", ProcessResult1 = result, ProcessResult2 = _ProcessState?.ProcessResult2, ProcessResult3 = _ProcessState?.ProcessResult3 };
}
}
private void DoProcess2(string dataSubset) {
// do work
ProcessResult result = new ProcessResult(); // this object contains the result
// on completion
lock (_Lock) {
_ProcessState = new ProcessState() { State = (_ProcessState.State ?? string.Empty) + ", 2 done", ProcessResult1 = _ProcessState?.ProcessResult1 , ProcessResult2 = result, ProcessResult3 = _ProcessState?.ProcessResult3 };
}
}
private void DoProcess3(string dataSubset) {
// do work
ProcessResult result = new ProcessResult(); // this object contains the result
// on completion
lock (_Lock) {
_ProcessState = new ProcessState() { State = (_ProcessState.State ?? string.Empty) + ", 3 done", ProcessResult1 = _ProcessState?.ProcessResult1, ProcessResult2 = _ProcessState?.ProcessResult2, ProcessResult3 = result };
}
}
}
public class DataEventArgs : System.EventArgs {
// data here is string, but could be anything -- just think of thread safety when accessing from the 3 processors
private readonly string _Data;
public DataEventArgs(string data) {
_Data = data;
}
public string Data => _Data;
}
public class DataFetcher {
// watch for data changes and fire when data has changed
public event System.EventHandler<DataEventArgs> DataChanged;
}
}
The simplest solution would be to define the scheduled logic in Task Method() style and execute the methods using Task.Run(), while in the main thread you just wait for the execution to finish using Task.WaitAny(). When a task is finished, you call Task.WaitAny again, but instead of the finished task you pass Task.Delay(timeUntilNextSchedule).
This way the tasks are not blocking the main thread, and you avoid spinning the CPU just to wait. In general, you can avoid managing threads directly in modern .NET.
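For illustration, a minimal sketch of that Task.WaitAny / Task.Delay loop; the DoProcessN bodies and the 5-second interval are placeholders, not taken from the answer above.
using System;
using System.Threading.Tasks;
class PeriodicScheduler
{
    private static readonly TimeSpan Interval = TimeSpan.FromSeconds(5);

    public static void Main()
    {
        // One slot per periodic job; a slot holds either the running job or the delay until its next run.
        Func<Task>[] jobs = { () => Task.Run(DoProcess1), () => Task.Run(DoProcess2), () => Task.Run(DoProcess3) };
        Task[] slots = new Task[jobs.Length];
        bool[] waiting = new bool[jobs.Length]; // true while the slot holds a Task.Delay
        for (int i = 0; i < slots.Length; i++) slots[i] = jobs[i]();

        while (true) // run forever, like a service; add cancellation as needed
        {
            int finished = Task.WaitAny(slots); // block until any job or delay completes
            if (waiting[finished])
            {
                slots[finished] = jobs[finished](); // delay elapsed: run the job again
                waiting[finished] = false;
            }
            else
            {
                slots[finished] = Task.Delay(Interval); // job done: wait until its next scheduled run
                waiting[finished] = true;
            }
        }
    }

    private static void DoProcess1() => Console.WriteLine("1 done");
    private static void DoProcess2() => Console.WriteLine("2 done");
    private static void DoProcess3() => Console.WriteLine("3 done");
}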
Depending on other requirements, like standardized error handling, monitoring capability, or management of these scheduled tasks, you could also rely on a more robust solution, like HangFire.

Partitioning with IEnumerable source

I have a ConcurrentQueue declared as IProducerConsumerCollection, i.e.
IProducerConsumerCollection<Job> _queue = new ConcurrentQueue<Job>();
and a producer method which adds jobs to _queue and a consumer method which processes jobs from _queue. Now, in the consumer method, I'd like to process the jobs concurrently. Below is the code for a sample class with producer and consumer methods:
public class TestQueue
{
IProducerConsumerCollection<Job> _queue = new ConcurrentQueue<Job>();
private static HttpClient _client = new HttpClient();
public TestQueue()
{
WorkProducerThread();
WorkConsumerThread();
}
public void WorkConsumerThread()
{
if (_queue.Count > 0)
{
//At this point, 4 partitions are created but all records are in 1st partition only; 2,3,4 partition are empty
var partitioner = Partitioner.Create(_queue).GetPartitions(4);
Task t = Task.WhenAll(
from partition in partitioner
select Task.Run(async () =>
{
using (partition)
{
while (partition.MoveNext())
await CreateJobs(partition.Current);
}
}));
t.Wait();
//At this point, queue count is still 20, how to remove item from _queue collection when processed?
}
}
private async Task CreateJobs(Job job)
{
HttpContent bodyContent = null;
await _client.PostAsync("job", bodyContent);
}
public void WorkProducerThread()
{
if (_queue.Count == 0)
{
try
{
for (int i = 0; i < 20; i++)
{
Job job = new Job { Id = i, JobName = "j" + i.ToString(), JobCreated = DateTime.Now };
_queue.TryAdd(job);
}
}
catch (Exception ex)
{
//_Log.Error("Exception while adding jobs to collection", ex);
}
}
}
}
public class Job
{
public int Id { get; set; }
public string JobName { get; set; }
public DateTime JobCreated { get; set; }
}
There are 2 problems:
Partitioner.Create(_queue).GetPartitions(4) creates 4 partitions, but all records are in the 1st partition only; partitions 2, 3 and 4 are empty. I can't figure out why this is happening. Ideally, all 4 partitions should have 5 records each (since there are 20 records in total in the queue). I read this article from MSDN on partitioning but didn't get any clue. I also checked the partitioning example from this article.
Also, I want to remove the item from _queue after processing in the consumer method, and the only way to remove an item is the _queue.TryTake method. I don't know how to remove items while partitioning.
I can consider any alternate way to achieve the same result.
Thanks in advance.
Partitioner.Create(_queue).GetPartitions(4) creates 4 partitions but all records are in the 1st partition only; partitions 2, 3 and 4 are empty.
This is not correct; your queue entries are being partitioned correctly. To verify, change your processing logic slightly to log the partition that is doing the work:
Task t = Task.WhenAll(
from partition in partitioner.Select((jobs, i) => new { jobs, i })
select Task.Run(async () =>
{
using (partition.jobs)
{
while (partition.jobs.MoveNext())
{
Console.WriteLine(partition.i);
await CreateJobs(partition.jobs.Current);
}
}
}));
You will notice that the Console.WriteLine will write values from 0 to 3 - indicating that they are being partitioned correctly.
Also, I want to remove the item from _queue after processing in
consumer method and there is only one way _queue.TryTake method to
remove item. I don't know how to remove item along with partitioning?
You can achieve that with a slight rewrite. The main changes are switching to BlockingCollection and adding this NuGet package to get access to GetConsumingPartitioner.
Give this a try:
using System;
using System.Collections.Concurrent;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;
namespace Test
{
public class TestQueue
{
BlockingCollection<Job> _queue = new BlockingCollection<Job>();
private static HttpClient _client = new HttpClient();
public TestQueue()
{
WorkProducerThread();
WorkConsumerThread();
}
public void WorkConsumerThread()
{
if (!_queue.IsCompleted)
{
//create 4 consuming partitions over the blocking collection; items are removed as they are consumed
var partitioner = _queue.GetConsumingPartitioner().GetPartitions(4);
Task t = Task.WhenAll(
from partition in partitioner
select Task.Run(async () =>
{
using (partition)
{
while (partition.MoveNext())
await CreateJobs(partition.Current);
}
}));
t.Wait();
Console.WriteLine(_queue.Count); // prints 0 - items are removed from the collection as the partitions consume them
}
}
private async Task CreateJobs(Job job)
{
//HttpContent bodyContent = null;
//await _client.PostAsync("job", bodyContent);
await Task.Delay(100);
}
public void WorkProducerThread()
{
if (_queue.Count == 0)
{
try
{
for (int i = 0; i < 20; i++)
{
Job job = new Job { Id = i, JobName = "j" + i.ToString(), JobCreated = DateTime.Now };
_queue.TryAdd(job);
}
_queue.CompleteAdding();
}
catch (Exception ex)
{
//_Log.Error("Exception while adding jobs to collection", ex);
}
}
}
}
public class Job
{
public int Id { get; set; }
public string JobName { get; set; }
public DateTime JobCreated { get; set; }
}
class Program
{
static void Main(string[] args)
{
var g = new TestQueue();
Console.ReadLine();
}
}
}

How to implement a continuous producer-consumer pattern inside a Windows Service

Here's what I'm trying to do:
Keep a queue in memory of items that need processed (i.e. IsProcessed = 0)
Every 5 seconds, get unprocessed items from the db, and if they're not already in the queue, add them
Continuously pull items from the queue, process them, and each time an item is processed, update it in the db (IsProcessed = 1)
Do this all "as parallel as possible"
I have a constructor for my service like
public MyService()
{
Ticker.Elapsed += FillQueue;
}
and I start that timer when the service starts like
protected override void OnStart(string[] args)
{
Ticker.Enabled = true;
Task.Run(() => { ConsumeWork(); });
}
and my FillQueue is like
private static async void FillQueue(object source, ElapsedEventArgs e)
{
var items = GetUnprocessedItemsFromDb();
foreach(var item in items)
{
if(!Work.Contains(item))
{
Work.Enqueue(item);
}
}
}
and my ConsumeWork is like
private static void ConsumeWork()
{
while(true)
{
if(Work.Count > 0)
{
var item = Work.Peek();
Process(item);
Work.Dequeue();
}
else
{
Thread.Sleep(500);
}
}
}
However this is probably a naive implementation and I'm wondering whether .NET has any type of class that is exactly what I need for this type of situation.
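As an aside, a minimal sketch of the built-in producer/consumer shape using BlockingCollection<T>; this is not what the answers below use, and WorkItem and Process(...) are placeholders for the types above.
using System.Collections.Concurrent;
using System.Threading.Tasks;
class WorkQueue
{
    private readonly BlockingCollection<WorkItem> _work = new BlockingCollection<WorkItem>();

    public WorkQueue()
    {
        // Consumer: GetConsumingEnumerable blocks while the collection is empty and wakes when items arrive,
        // replacing the Peek/Dequeue/Sleep loop above.
        Task.Run(() =>
        {
            foreach (var item in _work.GetConsumingEnumerable())
            {
                Process(item);
            }
        });
    }

    public void Enqueue(WorkItem item) => _work.Add(item); // called by the 5-second timer

    private void Process(WorkItem item) { /* process and set IsProcessed = 1 in the db */ }
}
class WorkItem { }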
Though @JSteward's answer is a good start, you can improve it by mixing the TPL Dataflow and Rx.NET extensions, as a dataflow block can easily become an observer for your data, and with the Rx timer it will be much less effort for you (Rx.Timer explanation).
We can adjust the MSDN article for your needs, like this:
private const int EventIntervalInSeconds = 5;
private const int DueIntervalInSeconds = 60;
var source =
// sequence of Int64 numbers, starting from 0
// https://msdn.microsoft.com/en-us/library/hh229435.aspx
Observable.Timer(
// fire first event after 1 minute waiting
TimeSpan.FromSeconds(DueIntervalInSeconds),
// fire all next events each 5 seconds
TimeSpan.FromSeconds(EventIntervalInSeconds))
// each number will have a timestamp
.Timestamp()
// each time we select some items to process
.SelectMany(GetItemsFromDB)
// filter already added
.Where(i => !_processedItemIds.ContainsKey(i.Id));
var action = new ActionBlock<Item>(ProcessItem, new ExecutionDataflowBlockOptions
{
// we can start as many item processing as processor count
MaxDegreeOfParallelism = Environment.ProcessorCount,
});
IDisposable subscription = source.Subscribe(action.AsObserver());
Also, your check for an item being already processed isn't quite accurate, as there is a possibility that an item gets selected as unprocessed from the db right at the time you've finished its processing but haven't yet updated it in the database. In that case the item will be removed from the Queue<T> and then added there again by the producer, which is why I've added a ConcurrentDictionary<Guid, byte> used as a thread-safe set to this solution (HashSet<T> isn't thread-safe, and ConcurrentBag<T> can't remove a specific item):
private static async Task ProcessItem(Item item)
{
if (_processedItemIds.ContainsKey(item.Id))
{
return;
}
_processedItemIds.TryAdd(item.Id, 0);
// actual work here
// save item as processed in database
// we need to wait to ensure the item doesn't appear in the queue again
await Task.Delay(TimeSpan.FromSeconds(EventIntervalInSeconds * 2));
// clear the processed cache to reduce memory usage
_processedItemIds.TryRemove(item.Id, out _);
}
public class Item
{
public Guid Id { get; set; }
}
// temporary cache for items in process
private static ConcurrentDictionary<Guid, byte> _processedItemIds = new ConcurrentDictionary<Guid, byte>();
private static IEnumerable<Item> GetItemsFromDB(Timestamped<long> time)
{
// log event timing
Console.WriteLine($"Event # {time.Value} at {time.Timestamp}");
// return items from DB
return new[] { new Item { Id = Guid.NewGuid() } };
}
You can implement cache cleanup in another way, for example by starting a "GC" timer which removes processed items from the cache on a regular basis, as sketched below.
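A minimal sketch of such a timer, assuming the cache stores when each id was added (a ConcurrentDictionary<Guid, DateTime> instead of the byte values above, with TryAdd(item.Id, DateTime.UtcNow) in ProcessItem); it uses System.Threading.Timer:
// Cache of in-flight ids keyed to the time they were added.
private static readonly ConcurrentDictionary<Guid, DateTime> _processedItemIds =
    new ConcurrentDictionary<Guid, DateTime>();

// Every minute, drop ids that are older than twice the polling interval.
private static readonly Timer _cacheCleanupTimer = new Timer(_ =>
{
    var cutoff = DateTime.UtcNow - TimeSpan.FromSeconds(EventIntervalInSeconds * 2);
    foreach (var entry in _processedItemIds)
    {
        if (entry.Value < cutoff)
        {
            _processedItemIds.TryRemove(entry.Key, out _);
        }
    }
}, null, TimeSpan.FromMinutes(1), TimeSpan.FromMinutes(1));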
To stop events and processing items you should Dispose the subscription and, maybe, Complete the ActionBlock:
subscription.Dispose();
action.Complete();
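If you also need to wait for items that were already posted to the block to finish processing, the block exposes a Completion task (this is an addition, not part of the original answer):
// Completes after Complete() has been called and all queued items have been processed.
action.Completion.Wait();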
You can find more information about Rx.Net in their guidelines on github.
You could use an ActionBlock to do your processing; it has a built-in queue that you can post work to. You can read up on TPL Dataflow here: Intro to TPL-Dataflow, also Introduction to Dataflow, Part 1. Finally, this is a quick sample to get you going. I've left out a lot, but it should at least get you started.
using System;
using System.Threading;
using System.Threading.Tasks;
using System.Threading.Tasks.Dataflow;
namespace MyWorkProcessor {
public class WorkProcessor {
public WorkProcessor() {
Processor = CreatePipeline();
}
public async Task StartProcessing() {
try {
await Task.Run(() => GetWorkFromDatabase());
} catch (OperationCanceledException) {
//handle cancel
}
}
// initialized here so cts.Token is available when CreatePipeline runs
private CancellationTokenSource cts {
get;
} = new CancellationTokenSource();
private ITargetBlock<WorkItem> Processor {
get;
}
private TimeSpan DatabasePollingFrequency {
get;
} = TimeSpan.FromSeconds(5);
private ITargetBlock<WorkItem> CreatePipeline() {
var options = new ExecutionDataflowBlockOptions() {
BoundedCapacity = 100,
CancellationToken = cts.Token
};
return new ActionBlock<WorkItem>(item => ProcessWork(item), options);
}
private async Task GetWorkFromDatabase() {
while (!cts.IsCancellationRequested) {
var work = await GetWork();
await Processor.SendAsync(work);
await Task.Delay(DatabasePollingFrequency);
}
}
private async Task<WorkItem> GetWork() {
return await Context.GetWork();
}
private void ProcessWork(WorkItem item) {
//do processing
}
}
}
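For completeness, a hypothetical way to start the processor above from an async method; WorkItem and the database Context remain elided, as the answer notes.
var processor = new WorkProcessor();
// Runs the 5-second polling loop and keeps posting work until the internal CancellationTokenSource is cancelled.
await processor.StartProcessing();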

How can I improve and/or modularize my handling of event based tasks?

So I have a server and I'm making calls to it through a wrapped-up WebSocket (WebSocket4Net), and one of the requirements of the library I'm building is the ability to await the return of the request. So I have a class MessageEventHandler that contains events that are triggered by the class MessageHandler as messages come in.
MessageEventHandler ex.
public class MessageEventHandler : IMessageEventHandler
{
public delegate void NodeNameReceived(string name);
public event Interfaces.NodeNameReceived OnNodeNameReceived;
public void NodeNameReceive(string name)
{
if (this.OnNodeNameReceived != null)
{
this.OnNodeNameReceived(name);
}
}
}
MessageHandler ex.
public class MessageHandler : IMessageHandler
{
private IMessageEventHandler eventHandler;
public MessageHandler(IMessageEventHandler eventHandler)
{
this.eventHandler = eventHandler;
}
public void ProcessDataCollectorMessage(string message)
{
var serviceMessage = JsonConvert.DeserializeObject<ServiceMessage>(message);
switch (serviceMessage.Type)
{
case MessageType.GetNodeName:
{
var nodeName = serviceMessage.Data as string;
if (nodeName != null)
{
this.eventHandler.NodeNameReceive(nodeName);
}
break;
}
default:
{
throw new NotImplementedException();
}
}
}
}
Now building upon those classes I have the class containing my asynchronous function that handles the call to get the node name.
public class ClientServiceInterface : IClientServiceInterface
{
public delegate void RequestReady(ServiceMessage serviceMessage);
public event Interfaces.RequestReady OnRequestReady;
public int ResponseTimeout { get; private set; }
private IMessageEventHandler messageEventHandler;
public ClientServiceInterface(IMessageEventHandler messageEventHandler, int responseTimeout = 5000)
{
this.messageEventHandler = messageEventHandler;
this.ResponseTimeout = responseTimeout;
}
public Task<string> GetNodeNameAsync()
{
var taskCompletionSource = new TaskCompletionSource<string>();
var setHandler = default(NodeNameReceived);
setHandler = name =>
{
taskCompletionSource.SetResult(name);
this.messageEventHandler.OnNodeNameReceived -= setHandler;
};
this.messageEventHandler.OnNodeNameReceived += setHandler;
var ct = new CancellationTokenSource(this.ResponseTimeout);
var registration = new CancellationTokenRegistration();
registration = ct.Token.Register(
() =>
{
taskCompletionSource.TrySetCanceled();
this.messageEventHandler.OnNodeNameReceived -= setHandler;
registration.Dispose();
},
false);
var serviceMessage = new ServiceMessage() { Type = MessageType.GetNodeName };
this.ReadyMessage(serviceMessage);
return taskCompletionSource.Task;
}
}
As you can see, I wouldn't call it pretty, and I apologize if anyone threw up a little reading it. But this is my first attempt at wrapping an asynchronous event in a Task. So with that on the table, I could use some help.
Is there a better way to accomplish what I'm trying to achieve here? Remember that I want a user of the library to either subscribe to the event and listen for all callbacks, OR simply await the return, depending on their needs.
var nodeName = await GetNodeNameAsync();
Console.WriteLine(nodeName);
or
messageEventHandler.OnNodeNameReceived += (name) => Console.WriteLine(name);
GetNodeNameAsync();
Alternatively if my approach is actually 'good' can anyone provide any advice as to how I can write a helper function to abstract out setting up each function in this way? Any help would be greatly appreciated.
So I've written a couple of classes to solve the problem I was having. The first is my CallbackHandle class, which contains the task inside a TaskCompletionSource, so each time a request is made in my example a new callback handle is created.
public class CallbackHandle<T>
{
public CallbackHandle(int timeout)
{
this.TaskCompletionSource = new TaskCompletionSource<T>();
var cts = new CancellationTokenSource(timeout);
cts.Token.Register(
() =>
{
if (this.Cancelled != null)
{
this.Cancelled();
}
});
this.CancellationToken = cts;
}
public event Action Cancelled;
public CancellationTokenSource CancellationToken { get; private set; }
public TaskCompletionSource<T> TaskCompletionSource { get; private set; }
}
Then I have a 'handler' that manages the handles and their creation.
public class CallbackHandler<T>
{
private readonly IList<CallbackHandle<T>> callbackHandles;
private readonly object locker = new object();
public CallbackHandler()
{
this.callbackHandles = new List<CallbackHandle<T>>();
}
public CallbackHandle<T> AddCallback(int timeout)
{
var callback = new CallbackHandle<T>(timeout);
callback.Cancelled += () =>
{
lock (this.locker)
{
this.callbackHandles.Remove(callback);
}
// T is generic, so we can't set a string result here; cancel the awaiting caller on timeout instead
callback.TaskCompletionSource.TrySetCanceled();
};
lock (this.locker)
{
this.callbackHandles.Add(callback);
}
return callback;
}
public void EventTriggered(T eventArgs)
{
lock (this.locker)
{
if (this.callbackHandles.Count > 0)
{
CallbackHandle<T> callback =
this.callbackHandles.First();
if (callback != null)
{
this.callbackHandles.Remove(callback);
callback.TaskCompletionSource.SetResult(eventArgs);
}
}
}
}
}
This is a simplified version of my actual implementation but it should get someone started if they need something similar. So to use this on my ClientServiceInterface class in my example I would start by creating a class level handler and using it like this:
public class ClientServiceInterface : IClientServiceInterface
{
private readonly CallbackHandler<string> getNodeNameHandler;
public ClientServiceInterface(IMessageEventHandler messageEventHandler, int responseTimeout = 5000)
{
this.messageEventHandler = messageEventHandler;
this.ResponseTimeout = responseTimeout;
this.getNodeNameHandler = new
CallbackHandler<string>();
this.messageEventHandler.OnNodeNameReceived += args => this.getNodeNameHandler.EventTriggered(args);
}
public Task<string> GetNodeNameAsync()
{
CallbackHandle<string> callbackHandle = this.getNodeNameHandler.AddCallback(this.ResponseTimeout);
var serviceMessage = new ServiceMessage
{
Type = MessageType.GetNodeName.ToString()
};
this.ReadyMessage(serviceMessage);
return callbackHandle.TaskCompletionSource.Task;
}
// Rest of class declaration removed for brevity
}
Which is much better looking than what I had before (at least in my opinion) and it's easy to extend.
For starters follow a thread-safe pattern:
public void NodeNameReceive(string name)
{
var evt = this.OnNodeNameReceived;
if (evt != null)
{
evt (name);
}
}
If you do not take a local copy of the event delegate, it can be set to null between the time you check for null and the time you invoke it.
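On C# 6 or later the same guard can be written more compactly; the null-conditional operator reads the delegate once and only invokes it if it is non-null:
public void NodeNameReceive(string name)
{
    // Equivalent to copying the delegate into a local and null-checking it.
    this.OnNodeNameReceived?.Invoke(name);
}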

Catch exceptions in async loading of dialog viewmodel

I have a DialogViewModel class with an async Task LoadData() method. This method loads data asynchronously and shows this dialog, which notifies the user about loading. Here is the code:
try
{
var dialog = new DialogViewModel();
var loadTask = dialog.LoadData();
WindowManager.ShowDialog(dialog);
await loadTask;
}
catch (Exception ex)
{
Logger.Error("Error in DialogViewModel", ex);
// Notify user about the error
}
When LoadData throws an exception, it isn't handled until the user exits the dialog. That happens because the exception is only observed at the await, which doesn't run until WindowManager.ShowDialog(dialog) completes.
What is the correct way to show a dialog with async loading? I've tried these ways:
Call LoadData() in OnShow(), the constructor or similar. But this won't work if I need to show this dialog without any data.
Call await LoadData() before showing the dialog. This way the user has to wait for the data to load before actually seeing the window, but I want the window to show up instantly with a loading indicator.
Why is there an explicit public LoadData method?
If this has to happen, then do it inside the constructor asynchronously using Task<T> with a ContinueWith, and process any exception generated by checking the IsFaulted property on the returned task.
This would address both issues you've highlighted.
A very simple example is shown below; obviously your implementation will be more complicated.
public class DialogViewModel
{
private Task _task;
public DialogViewModel()
{
var context = TaskScheduler.FromCurrentSynchronizationContext();
_task = Task.Factory.StartNew(() =>
{
var data = GetDataCollection();
return data;
})
.ContinueWith(t =>
{
if (t.IsFaulted)
{
HasErrored = true;
ErrorMessage = "It's borked!";
}
else
{
Data = t.Result;
}
}, context);
}
public IEnumerable<string> Data { get; private set; }
public bool HasErrored { get; private set; }
public string ErrorMessage { get; private set; }
private static IEnumerable<string> GetDataCollection()
{
return new List<string>()
{
"John",
"Jack",
"Steve"
};
}
}
Or, if you don't want to use Task<T> explicitly and want to use async/await functionality, you could use a slightly different approach, because you can't use async/await in a class constructor:
public class DialogViewModel
{
public IEnumerable<string> Data { get; private set; }
public bool HasErrored { get; private set; }
public string ErrorMessage { get; private set; }
async public static Task<DialogViewModel> BuildViewModelAsync()
{
try
{
var data = await GetDataCollection();
return new DialogViewModel(data);
}
catch (Exception)
{
return new DialogViewModel("Failed!");
}
}
private DialogViewModel(IEnumerable<string> data)
{
Data = data;
}
private DialogViewModel(string errorMessage)
{
HasErrored = true;
ErrorMessage = errorMessage;
}
private async static Task<IEnumerable<string>> GetDataCollection()
{
// do something async...
return await Task.Factory.StartNew(() => new List<string>()
{
"John",
"Jack",
"Steve"
});
}
}
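A hedged sketch of how the calling code from the question might consume this factory method (WindowManager is the question's own helper):
// The data (or the failure) is already captured before the dialog is shown,
// so no exception surfaces while ShowDialog is blocking.
var dialog = await DialogViewModel.BuildViewModelAsync();
if (dialog.HasErrored)
{
    // notify the user / log dialog.ErrorMessage here
}
WindowManager.ShowDialog(dialog);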
