I have to load some objects from Azure. To take less time, I planned to load them five at a time (a kind of lazy loading):
ProductManager.LoadProduct calls my ProductAccess.LoadProduct.
This last method loads products from Azure and then raises an event so that the manager can get the products. If the manager receives products, it calls ProductAccess.LoadProduct again.
And so on.
I can't use await/async because this is cross-platform code (the stable versions of MonoDroid/MonoTouch still don't support await/async).
The first load is correct and the second call happens, but my task does not seem to be executed (Start does not run my second task). I checked the thread id: the second time, the Task.Factory.StartNew(() => ...) runs on the main thread. I tried to fix it by specifying LongRunning, but that still doesn't work.
Here is my code:
Manager side:
public void LoadProduct()
{
ProductAccess.LoadProductsAsync();
}
public void receiveProductsAsync(Object pa, EventArgs e)
{
if (((ProductEventArgs)e).GetAttribute.Equals("LoadProductsAsync"))
{
IoC.Resolve<IProductAccess>().RequestEnded -= receiveProductsAsync;
if (((ProductEventArgs)e).LP.Count() != 0)
LoadProductsAsync();
Products = Products.Concat(((ProductEventArgs)e).LP).ToList();
if (((ProductEventArgs)e).E != null)
{
if (RequestEnded != null)
RequestEnded(this, new OperationEventArgs() { Result = false, E = ((ProductEventArgs)e).E, GetAttribute = "LoadProductsAsync" });
}
else
{
if (RequestEnded != null)
{
RequestEnded(this, new OperationEventArgs() { Result = true, GetAttribute = "LoadProductsAsync" });
}
}
}
}
Access side:
public void LoadProductsAsync()
{
Task<ProductEventArgs>.Factory.StartNew(() =>
{
var longRunningTask = new Task<ProductEventArgs>(() =>
{
try
{
var _items = this.table.Select(x => new Product(.....)).Skip(nbrProductLoaded).Take(nbrProductToLoadAtEachTime).ToListAsync().Result;
this.nbrProductLoaded += _items.Count();
Task.Factory.StartNew(() => synchronizeFavorite(_items));
return new ProductEventArgs() { LP = _items, GetAttribute = "LoadProductsAsync" };
}
catch (Exception e)
{
return new ProductEventArgs() { E = e, GetAttribute = "LoadProductsAsync" };
}
}, TaskCreationOptions.LongRunning);
longRunningTask.Start();
if (longRunningTask.Wait(timeout))
return longRunningTask.Result;
return new ProductEventArgs() { E = new Exception("timed out"), GetAttribute = "LoadProductsAsync" };
}, TaskCreationOptions.LongRunning).ContinueWith((x) => {
handleResult(x.Result);
}, TaskScheduler.FromCurrentSynchronizationContext());
}
Task.Factory.StartNew will by default use the current TaskScheduler.
The first time, you're not running inside a task, so the current TaskScheduler is the default TaskScheduler (which will run on the thread pool).
When you schedule handleResult back to the original context by using TaskScheduler.FromCurrentSynchronizationContext, that will run handleResult within a task on the main thread. So in that context, the current TaskScheduler is the UI TaskScheduler, not the default TaskScheduler.
To fix this, explicitly pass TaskScheduler.Default to any StartNew that you want to run on a thread pool thread (and remove LongRunning).
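For illustration, a minimal sketch of that fix; LoadProductsCore here is a hypothetical stand-in for the body that does the blocking Azure call in the code above:

Task.Factory.StartNew<ProductEventArgs>(
        () => LoadProductsCore(),                // the blocking Azure call
        CancellationToken.None,
        TaskCreationOptions.None,
        TaskScheduler.Default)                   // always the thread pool, never the UI scheduler
    .ContinueWith(
        t => handleResult(t.Result),             // back on the UI thread
        TaskScheduler.FromCurrentSynchronizationContext());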
I am trying to wait for the end of a task. This task loads a bunch of things from the server and usually takes about 2 to 3 seconds. It is started by LoadItemsCommand. Once the task is over, it should check whether anything has been loaded and whether any of the loaded items already contain values. If so, it should remove a button from the toolbar.
But it doesn't wait.
The command call specifically asks to await the task, but on the main page things proceed without waiting for the end of the task. I tried adding a loop with a 30-second wait to wait for the command to finish, but nothing.
Here is the command in the viewmodel:
LoadItemsCommand = new Command(async () => await ExecuteLoadItemsCommand());
Here is the LoadItems Function in the ViewModel
async Task ExecuteLoadItemsCommand()
{
if (IsBusy)
return;
IsBusy = true;
List<string> stockIds = new List<string>();
foreach (var orderline in this._order)
{
stockIds.Add(orderline.StockId);
}
string[] array = new string[stockIds.Count];
stockIds.CopyTo(array, 0);
this._products = await Common.GetProducts(new ProductSearchFilter { StockId = array });
_scanStart = Common.Now;
Items.Clear();
bool already_scanned = false; // TODO: find a better name, this one is not very explicit
foreach (PickingOrderLineToHandle orderLine in _order)
{
if (orderLine.ScannedQuantity < orderLine.Quantity)
{
if ((bool)orderLine.WasSentToEnd && !already_scanned)
{
Items.Add(new ListItem() { Name = " ----- ", Id = "-1" });
already_scanned = true;
}
Product product = null;
foreach (var prod in _products)
{
if (prod.StockId == orderLine.StockId)
{
product = prod;
break;
}
}
Items.Add(new ListItem() { Name = $"{(orderLine.Quantity - orderLine.ScannedQuantity).ToString()} / {orderLine.Quantity}\t + {orderLine.Table_bin.LabelAddress} {product.ProductName} {orderLine.stock.Platform}", Id = orderLine.StockId });
}
}
IsBusy = false;
}
And here is the call in the page:
protected override void OnAppearing()
{
base.OnAppearing();
viewModel.LoadItemsCommand.Execute(null);
int loop = 0;
while (viewModel.IsBusy && loop < 60)
{
System.Threading.Thread.Sleep(500);
loop++;
};
if (loop == 60)
{
DisplayAlert("Erreur", "Une erreur est survenue lors du chargment. Veuillez réessayer.", "Ok");
Navigation.PopAsync();
}
var cantCancel = viewModel.Items.Any(i => i.BookedQuantity > 0);
if (Common.IsTeamLeader)
cantCancel = false;
if (cantCancel)
{
var cancelButton = this.ToolbarItems.Where(b => b.Text == "Annuler").First();
this.ToolbarItems.Remove(cancelButton);
}
}
Problem
The following statement seems like a blocking call, but actually, since the method that is called by the Command is an asynchronous anonymous function, it will just execute in a fire and forget fashion and will not be awaited:
viewModel.LoadItemsCommand.Execute(null);
That's because of how the Command is defined:
LoadItemsCommand = new Command(async () => await ExecuteLoadItemsCommand());
A Command like this always executes synchronously. However, when calling an async void method from within a synchronous context, it will just begin execution, but won't await the method (hence fire and forget).
Solution
Now, the way you're using the Command is unusual. Normally, you would bind to the Command from a UI control or you would just trigger its execution. You are expecting it to finish before the rest of the code is executed, which isn't the case as I've explained above.
Instead of executing the Command like that, you could instead do the following:
Make sure ExecuteLoadItemsCommand() is returning a Task
Change the visibility of ExecuteLoadItemsCommand() to public
Change the name to ExecuteLoadItemsAsync() which is a common notation for methods that return asynchronous Tasks
Your new method signature should look something like this:
public async Task ExecuteLoadItemsAsync()
{
//...
}
Then, in your page's code-behind, you could change the OnAppearing() method override to run asynchronously by adding the async keyword to the signature and then awaiting the ExecuteLoadItemsAsync() method:
protected override async void OnAppearing()
{
base.OnAppearing();
await viewModel.ExecuteLoadItemsAsync();
//...
}
This way, you're executing and awaiting the method directly. The Command can still be used for any UI elements to bind to, such as buttons.
Optional improvement
In your ViewModel, you could use an AsyncRelayCommand instead of Command like this:
private AsyncRelayCommand _loadItemsCommand;
public AsyncRelayCommand LoadItemsCommand => _loadItemsCommand ??= new AsyncRelayCommand(ExecuteLoadItemsAsync);
However, this doesn't change the fact that you shouldn't execute the Command in your page's code behind the way you've been attempting.
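If you do still need to trigger the command from code-behind, the async command can be awaited properly instead of fired and forgotten. A minimal sketch, assuming AsyncRelayCommand here is the one from the CommunityToolkit.Mvvm package (its IAsyncRelayCommand interface exposes ExecuteAsync); ItemsViewModel is an illustrative name:

using CommunityToolkit.Mvvm.Input;

// ViewModel
public IAsyncRelayCommand LoadItemsCommand { get; }

public ItemsViewModel()
{
    LoadItemsCommand = new AsyncRelayCommand(ExecuteLoadItemsAsync);
}

// Page code-behind: awaiting ExecuteAsync instead of calling Execute(null)
protected override async void OnAppearing()
{
    base.OnAppearing();
    await viewModel.LoadItemsCommand.ExecuteAsync(null);
    // ... toolbar logic can safely run here
}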
I have an issue with one endpoint blocking calls from the other endpoints in my app. When we call this endpoint, it basically blocks all other API calls from executing; they have to wait until it has finished.
public async Task<ActionResult> GrantAccesstoUsers()
{
// other operations
var grantResult = await
this._workSpaceProvider.GrantUserAccessAsync(this.CurrentUser.Id).ConfigureAwait(false);
return this.Ok(grantResult);
}
The GrantUserAccessAsync method builds a set of tasks that are run in parallel.
public async Task<List<WorkspaceDetail>> GrantUserAccessAsync(string currentUser)
{
var responselist = new List<WorkspaceDetail>();
try
{
// calling these up front so they can be reused once the tasks are created
// these are not expensive calls
var properlyNamedWorkSpaces = await this._helper.GetProperlyNamedWorkspacesAsync(true).ConfigureAwait(false);
var dbGroups = await this._reportCatalogProvider.GetWorkspaceFromCatalog().ConfigureAwait(false);
var catalogInfo = await this._clientServiceHelper.GetDatabaseConfigurationAsync("our-service").ConfigureAwait(false);
if (properlyNamedWorkSpaces != null && properlyNamedWorkSpaces.Count > 0)
{
// these methods returns tasks for parallel processing
var grantUserContributorAccessTaskList = await this.GrantUserContributorAccessTaskList(properlyNamedWorkSpaces, currentUser, dbGroups, catalogInfo).ConfigureAwait(false);
var grantUserAdminAccessTaskList = await this.GrantUserAdminAccessTaskList(properlyNamedWorkSpaces, currentUser, dbGroups, catalogInfo).ConfigureAwait(false);
var removeInvalidUserAndSPNTaskList = await this.RemoveAccessRightsToWorkspaceTaskList(properlyNamedWorkSpaces, dbGroups, currentUser, catalogInfo).ConfigureAwait(false);
var tasklist = new List<Task<WorkspaceDetail>>();
tasklist.AddRange(grantUserContributorAccessTaskList);
tasklist.AddRange(grantUserAdminAccessTaskList);
tasklist.AddRange(removeInvalidUserAndSPNTaskList);
// Start running Parallel Task
Parallel.ForEach(tasklist, task =>
{
Task.Delay(this._config.CurrentValue.PacingDelay);
task.Start();
});
// Get all client workspace processing results
var clientWorkspaceProcessingResult = await Task.WhenAll(tasklist).ConfigureAwait(false);
// Populate result
responselist.AddRange(clientWorkspaceProcessingResult.ToList());
}
}
catch (Exception)
{
throw;
}
return responselist;
}
These methods are basically identical in structure and they look like this:
private async Task<List<Task<WorkspaceDetail>>> GrantUserContributorAccessTaskList(List<Group> workspaces, string currentUser, List<WorkspaceManagement> dbGroups, DatabaseConfig catalogInfo)
{
var tasklist = new List<Task<WorkspaceDetail>>();
foreach (var workspace in workspaces)
{
tasklist.Add(new Task<WorkspaceDetail>(() =>
this.GrantContributorAccessToUsers(workspace, currentUser, dbGroups, catalogInfo).Result));
// i added a delay here because we encountered an issue before in production and this seems to solve the problem. this is set to 4ms.
Task.Delay(this._config.CurrentValue.DelayInMiliseconds);
}
return tasklist;
}
The other methods called here look like this:
private async Task<WorkspaceDetail> GrantContributorAccessToUsers(Group workspace, string currentUser, List<Data.ReportCatalogDB.WorkspaceManagement> dbGroups, DatabaseConfig catalogInfo)
{
// This prevents other threads or tasks from starting and keeps us from exceeding the allowed number of threads
await this._batchProcessor.WaitAsync().ConfigureAwait(false);
var result = new WorkspaceDetail();
try
{
var contributorAccessresult = await this.helper.GrantContributorAccessToUsersAsync(workspace, this._powerBIConfig.CurrentValue.SPNUsers).ConfigureAwait(false);
if (contributorAccessresult != null
&& contributorAccessresult.Count > 0)
{
// do something
}
else
{
// do something
}
// this is done to reuse the call that is being executed in the helper above. it's an expensive call from an external endpoint so we opted to reuse what was used in the initial call, instead of calling it again for this process
var syncWorkspaceAccessToDb = await this.SyncWorkspaceAccessAsync(currentUser, workspace.Id, contributorAccessresult, dbGroups, catalogInfo).ConfigureAwait(false);
foreach (var dbResponse in syncWorkspaceAccessToDb) {
result.ResponseMessage += dbResponse.ResponseMessage;
}
}
catch (Exception ex)
{
this._loghelper.LogEvent(this._logger, logEvent, OperationType.GrantContributorAccessToWorkspaceManager, LogEventStatus.FAIL);
}
finally
{
this._batchProcessor.Release();
}
return result;
}
The last method called writes the record in a database table:
private async Task<List<WorkspaceDetail>> SyncWorkspaceAccessAsync(string currentUser,
Guid workspaceId,
List<GroupUser> groupUsers,
List<WorkspaceManagement> dbGroups,
DatabaseConfig catalogInfo) {
var result = new List<WorkspaceDetail>();
var tasklist = new List<Task<WorkspaceDetail>>();
// get active workspace details from the db
var workspace = dbGroups.Where(x => x.PowerBIGroupId == workspaceId).FirstOrDefault();
try
{
// to auto dispose the provider, we are creating this for each instance because
// having only one instance creates an error when the other task starts running
using (var contextProvider = this._contextFactory.GetReportCatalogProvider(
catalogInfo.Server,
catalogInfo.Database,
catalogInfo.Username,
catalogInfo.Password,
this._dbPolicy))
{
if (workspace != null)
{
// get current group users in the db from the workspace object
var currentDbGroupUsers = workspace.WorkspaceAccess.Where(w => w.Id == workspace.Id
&& w.IsDeleted == false).ToList();
#region identify to process
#region users to add
// identify users to add
var usersToAdd = groupUsers.Where(g => !currentDbGroupUsers.Any(w => w.Id == workspace.Id ))
.Select(g => new WorkspaceAccess
{
// class properties
}).ToList();
#endregion
var addTasks = await this.AddWorkspaceAccessToDbTask(contextProvider, usersToAdd, workspace.PowerBIGroupId, workspace.WorkspaceName).ConfigureAwait(false);
tasklist.AddRange(addTasks);
// this is a potential fix that i did, hoping adding another parallel thread can solve the problem
Parallel.ForEach(tasklist, new ParallelOptions { MaxDegreeOfParallelism = this._config.CurrentValue.MaxDegreeOfParallelism }, task =>
{
Task.Delay(this._config.CurrentValue.PacingDelay);
task.Start();
});
var processResult = await Task.WhenAll(tasklist).ConfigureAwait(false);
// Populate result
result.AddRange(processResult.ToList());
}
}
}
catch (Exception ex)
{
// handle error
}
return result;
}
I have already tried some potential solutions. For example, the methods here were originally written with Task.FromResult instead of async, so I changed that. Reference:
Using Task.FromResult v/s await in C#
I also thought it was similar to an issue we faced before, where creating the multiple db context connections needed by the parallel tasks caused errors; back then adding a small delay between tasks helped, but that didn't solve this problem.
Task.Delay(this._config.CurrentValue.DelayInMiliseconds);
Any help would be much appreciated.
I assume your this._batchProcessor is an instance of SemaphoreSlim. If your other endpoints somehow call
await this._batchProcessor.WaitAsync()
that means they can't go any further until the semaphore is released.
Another thing I'd like to mention: please avoid mixing Parallel.ForEach with async/await. Parallel.ForEach is not designed to work with async delegates; here is a good answer on why you should avoid using them together: Nesting await in Parallel.ForEach
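As a rough sketch of the kind of replacement that answer describes, using the names from the question (GrantContributorAccessToUsers, Group, WorkspaceDetail, and so on) and a SemaphoreSlim for throttling in place of the cold tasks plus Parallel.ForEach (you could reuse your existing _batchProcessor instead):

private async Task<List<WorkspaceDetail>> GrantContributorAccessAsync(
    List<Group> workspaces,
    string currentUser,
    List<WorkspaceManagement> dbGroups,
    DatabaseConfig catalogInfo)
{
    // Throttle concurrency without blocking threads and without cold tasks.
    var throttle = new SemaphoreSlim(8);

    var tasks = workspaces.Select(async workspace =>
    {
        await throttle.WaitAsync().ConfigureAwait(false);
        try
        {
            // Hot task: starts immediately, no task.Start() or Parallel.ForEach needed.
            return await this.GrantContributorAccessToUsers(workspace, currentUser, dbGroups, catalogInfo)
                             .ConfigureAwait(false);
        }
        finally
        {
            throttle.Release();
        }
    }).ToList();

    var results = await Task.WhenAll(tasks).ConfigureAwait(false);
    return results.ToList();
}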
I want to use the Job System to load some things in the background, but the Job System apparently uses the main thread. So how do I use the Job System without freezes? Is it impossible?
Or should I just use a C# Thread and not use the Job System at all?
struct SleepJob : IJobParallelFor
{
public void Execute(int index)
{
Debug.LogFormat("[SleepJob.Execute] Thread Id {0}", Thread.CurrentThread.ManagedThreadId);
Thread.Sleep(1);
}
}
struct SleepJob2 : IJobParallelFor
{
public void Execute(int index)
{
Debug.LogFormat("[SleepJob2.Execute] Thread Id {0}", Thread.CurrentThread.ManagedThreadId);
Thread.Sleep(1);
}
}
[ContextMenu("JobSleep")]
public void JobSleep()
{
Debug.LogFormat("[JobSleep.Execute] Thread Id {0}", Thread.CurrentThread.ManagedThreadId);
SleepJob job = new SleepJob() { };
SleepJob2 job2 = new SleepJob2() { };
JobHandle jh = job.Schedule(100, 64);
JobHandle jh2 = job2.Schedule(100, 64, jh);
JobHandle.ScheduleBatchedJobs();
jh2.Complete(); // freezes
Debug.LogFormat("[JobSleep.Execute] jh.Complete();");
}
Note that Complete makes your execution freeze because:
The JobSystem automatically prioritizes the job and any of its dependencies to run first in the queue, then attempts to execute the job itself on the thread which calls the Complete function.
I think this is not the method you want to use.
Rather try waiting for the Job to complete in a Coroutine like e.g.
[ContextMenu("JobSleep")]
public void JobSleep()
{
Debug.LogFormat("[JobSleep.Execute] Thread Id {0}", Thread.CurrentThread.ManagedThreadId);
SleepJob job = new SleepJob() { };
SleepJob2 job2 = new SleepJob2() { };
JobHandle jh = job.Schedule(100, 64);
JobHandle jh2 = job2.Schedule(100, 64, jh);
JobHandle.ScheduleBatchedJobs();
StartCoroutine(WaitFor(jh2));
}
IEnumerator WaitFor(JobHandle job)
{
    yield return new WaitUntil(() => job.IsCompleted);
    // Complete still has to be called before accessing any data the job wrote
    job.Complete();
    Debug.LogFormat("[JobSleep.Execute] job IsCompleted");
}
Unfortunately you didn't add the code you actually want to be executed in background.
In general maybe simply using a Thread or async might already solve your problem.
e.g. with a Thread
// requires: using System; using System.Collections.Concurrent; using System.Threading;
// for sending back responses to the main thread
private ConcurrentQueue<Action> callbacks = new ConcurrentQueue<Action>();
// work the callbacks in the main thread
private void Update()
{
while(callbacks.Count > 0)
{
Action callback;
if(callbacks.TryDequeue(out callback)) callback?.Invoke();
}
}
// optional callback on success
public void StartBackgroundTask(Action onSuccess = null)
{
var thread = new Thread(new ParameterizedThreadStart(TheTaskThatTakesAWhile));
// pass in parameters here
thread.Start(onSuccess);
}
// Will be running in a background thread
private void TheTaskThatTakesAWhile(object state)
{
    // ParameterizedThreadStart only passes an object, so cast back to the callback
    var onSuccess = state as Action;
    // hand this back to the main thread
    callbacks.Enqueue(() => Debug.Log("Long task started ..."));
    // TODO whatever takes so long
    // Note btw that sleep is in milliseconds!
    Thread.Sleep(1000);
    // hand this back to the main thread
    callbacks.Enqueue(() =>
    {
        Debug.Log("Long task done ...");
        onSuccess?.Invoke();
    });
}
e.g. using async (wrap the heavy work in Task.Run so it actually runs on a thread-pool thread instead of the main thread)
private async Task TheTaskThatTakesAWhile()
{
    // do whatever takes long here, off the main thread
    await Task.Run(() => Thread.Sleep(1000));
}
// usually you should avoid having async void
// but when including an Action as callback this is fine
public async void StartBackgroundTask(Action onSuccess = null)
{
await TheTaskThatTakesAWhile();
onSuccess?.Invoke();
}
Both you could call like e.g.
StartBackgroundTask(() => {
// what should happen when done?
Debug.Log("I'm done!");
});
Make JobHandle jh2 a field instead of a local variable, then check JobHandle.IsCompleted in the Update method or in a coroutine.
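A minimal sketch of that suggestion; the MonoBehaviour and field names are illustrative, and SleepJob is the struct from the question:

using Unity.Jobs;
using UnityEngine;

public class SleepJobRunner : MonoBehaviour
{
    private JobHandle _handle;
    private bool _scheduled;

    [ContextMenu("JobSleep")]
    public void JobSleep()
    {
        var job = new SleepJob();
        _handle = job.Schedule(100, 64);
        JobHandle.ScheduleBatchedJobs();
        _scheduled = true;
    }

    private void Update()
    {
        if (_scheduled && _handle.IsCompleted)
        {
            // Complete is still required before touching any data the job wrote.
            _handle.Complete();
            _scheduled = false;
            Debug.Log("[SleepJobRunner] job finished");
        }
    }
}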
In my application there are three threads:
private Thread _analysisThread;
private Thread _head2HeadThread;
private Thread _formThread;
and each thread is started in the following way:
if (_analysisThread == null || !_analysisThread.IsAlive)
{
_analysisThread = new Thread(() => { Analysis.Logic(match); });
_analysisThread.Start();
}
I have a ListView where the user can select an item and start the threads again, but I want to prevent this because the methods inside each thread are heavy and need time to complete.
For now I want to disable the ListView selection, so I did:
<ListView IsEnabled="{Binding IsMatchListEnabled}">
private bool _isMatchListEnabled = true;
public bool IsMatchListEnabled
{
get { return _isMatchListEnabled; }
set
{
_isMatchListEnabled = value;
OnPropertyChanged();
}
}
Before a new thread starts I set IsMatchListEnabled = false;. What I need is to check whether all the threads have finished and then set IsMatchListEnabled = true;. If I simply enable the ListView right after starting the threads, it ends up enabled immediately, because the thread code runs asynchronously while the code outside the threads runs synchronously, so right now this property is useless.
To avoid this I tried creating an infinite loop like this:
while (true)
{
if (!_analysisThread.IsAlive && !_head2HeadThread.IsAlive && !_formThread.IsAlive)
{
IsMatchListEnabled = true;
break;
}
}
This loop is placed after all the threads have been started, but as you can imagine, it freezes the application.
Any solution?
All the comments are correct: it's better to use Tasks (a sketch of that approach follows the example below). But just to answer the question as asked:
You can synchronize the threads with ManualResetEvent, keeping an array of events (one per thread) and one additional thread that sets IsMatchListEnabled once all the events are signalled.
public static void SomeThreadAction(object id)
{
var ev = new ManualResetEvent(false);
events[id] = ev; // store the event somewhere
Thread.Sleep(2000 * (int)id); // do your work
ev.Set(); // set the event signaled
}
Then, somewhere else, we need to start the waiting routine.
// we need tokens to be able to cancel waiting
var cts = new CancellationTokenSource();
var ct = cts.Token;
Task.Factory.StartNew(() =>
{
bool completed = false;
while (!ct.IsCancellationRequested && !completed)
{
// will check if our routine is cancelled each second
completed =
WaitHandle.WaitAll(
events.Values.Cast<ManualResetEvent>().ToArray(),
TimeSpan.FromSeconds(1));
}
if (completed) // if not completed, then somebody cancelled our routine
; // change your variable here
});
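For comparison, here is a hedged sketch of the Task-based version the comments allude to; Head2Head.Logic and Form.Logic are assumed counterparts of Analysis.Logic (the only method shown in the question), and Match is whatever type match has in your code:

private async void RunAnalyses(Match match) // e.g. called from the selection handler
{
    IsMatchListEnabled = false;
    try
    {
        // Run the three heavy operations on thread-pool threads in parallel.
        await Task.WhenAll(
            Task.Run(() => Analysis.Logic(match)),
            Task.Run(() => Head2Head.Logic(match)),  // assumed name
            Task.Run(() => Form.Logic(match)));      // assumed name
    }
    finally
    {
        // Execution resumes on the UI thread here, so the binding update is safe.
        IsMatchListEnabled = true;
    }
}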
I would suggest using Microsoft's Reactive Framework for this. It's more powerful than tasks and the code is far simpler than using threads.
Let's say you have 3 long-running operations:
Action huey = () => { Console.WriteLine("Huey Start"); Thread.Sleep(5000); Console.WriteLine("Huey Done"); };
Action dewey = () => { Console.WriteLine("Dewey Start"); Thread.Sleep(5000); Console.WriteLine("Dewey Done"); };
Action louie = () => { Console.WriteLine("Louie Start"); Thread.Sleep(5000); Console.WriteLine("Louie Done"); };
Now you can write the following simple query:
IObservable<Unit> query =
from a in new [] { huey, dewey, louie }.ToObservable()
from u in Observable.Start(() => a())
select u;
You run it like this:
Stopwatch sw = Stopwatch.StartNew();
IDisposable subscription = query.Subscribe(u => { }, () =>
{
Console.WriteLine("All Done in {0} seconds.", sw.Elapsed.TotalSeconds);
});
The results I get are:
Huey Start
Dewey Start
Louie Start
Huey Done
Louie Done
Dewey Done
All Done in 5.0259197 seconds.
Three 5 second operations complete in 5.03 seconds. All in parallel.
If you want to stop the computation early just call subscription.Dispose().
NuGet "System.Reactive" to get the bits.
Teaser: guys, this question is not about how to implement a retry policy. It's about correctly completing a TPL Dataflow block.
This question is mostly a continuation of my previous question, Retry policy within ITargetBlock. The answer to that question was #svick's smart solution that utilizes a TransformBlock (source) and a TransformManyBlock (target). The only problem left is to complete this block in the right way: wait for all the retries to complete first, and then complete the target block. Here is what I ended up with (it's just a snippet; don't pay too much attention to the non-thread-safe retries set):
var retries = new HashSet<RetryingMessage<TInput>>();
TransformManyBlock<RetryableMessage<TInput>, TOutput> target = null;
target = new TransformManyBlock<RetryableMessage<TInput>, TOutput>(
async message =>
{
try
{
var result = new[] { await transform(message.Data) };
retries.Remove(message);
return result;
}
catch (Exception ex)
{
message.Exceptions.Add(ex);
if (message.RetriesRemaining == 0)
{
if (failureHandler != null)
failureHandler(message.Exceptions);
retries.Remove(message);
}
else
{
retries.Add(message);
message.RetriesRemaining--;
Task.Delay(retryDelay)
.ContinueWith(_ => target.Post(message));
}
return null;
}
}, dataflowBlockOptions);
source.LinkTo(target);
source.Completion.ContinueWith(async _ =>
{
while (target.InputCount > 0 || retries.Any())
await Task.Delay(100);
target.Complete();
});
The idea is to perform some kind of polling and verify whether there are still messages waiting to be processed and whether any messages still require retrying. But I don't like the idea of polling in this solution.
Yes, I can encapsulate the logic of adding/removing retries into a separate class, and even perform some action when the set of retries becomes empty, but how do I deal with the target.InputCount > 0 condition? There is no callback that gets called when the block has no pending messages, so it seems that checking target.InputCount in a loop with a small delay is the only option.
Does anybody knows a smarter way to achieve this?
Maybe a ManualResetEvent can do the trick for you.
Add a public property to TransformManyBlock
private ManualResetEvent _signal = new ManualResetEvent(false);
public ManualResetEvent Signal { get { return _signal; } }
And here you go:
var retries = new HashSet<RetryingMessage<TInput>>();
TransformManyBlock<RetryableMessage<TInput>, TOutput> target = null;
target = new TransformManyBlock<RetryableMessage<TInput>, TOutput>(
async message =>
{
try
{
var result = new[] { await transform(message.Data) };
retries.Remove(message);
// Sets the state of the event to signaled, allowing one or more waiting threads to proceed
if(!retries.Any()) Signal.Set();
return result;
}
catch (Exception ex)
{
message.Exceptions.Add(ex);
if (message.RetriesRemaining == 0)
{
if (failureHandler != null)
failureHandler(message.Exceptions);
retries.Remove(message);
// Sets the state of the event to signaled, allowing one or more waiting threads to proceed
if(!retries.Any()) Signal.Set();
}
else
{
retries.Add(message);
message.RetriesRemaining--;
Task.Delay(retryDelay)
.ContinueWith(_ => target.Post(message));
}
return null;
}
}, dataflowBlockOptions);
source.LinkTo(target);
source.Completion.ContinueWith(async _ =>
{
//Blocks the current thread until the current WaitHandle receives a signal.
target.Signal.WaitOne();
target.Complete();
});
I am not sure where your target.InputCount gets updated, but at the place where it changes you could add the following code:
if(InputCount == 0) Signal.Set();
Combining hwcverwe's answer and JamieSee's comment could be the ideal solution.
First, you need to create more than one event:
var signal = new ManualResetEvent(false);
var completedEvent = new ManualResetEvent(false);
Then, you have to create an observer and subscribe it to the TransformManyBlock, so you are notified when a relevant event happens:
var observer = new RetryingBlockObserver<TOutput>(completedEvent);
var observable = target.AsObservable();
observable.Subscribe(observer);
The observer can be quite simple:
private class RetryingBlockObserver<T> : IObserver<T> {
private ManualResetEvent completedEvent;
public RetryingBlockObserver(ManualResetEvent completedEvent) {
this.completedEvent = completedEvent;
}
public void OnCompleted() {
completedEvent.Set();
}
public void OnError(Exception error) {
//TODO
}
public void OnNext(T value) {
//TODO
}
}
And you can wait for either the signal, or completion (exhaustion of all the source items), or both
source.Completion.ContinueWith(_ => {
    WaitHandle.WaitAll(new WaitHandle[] { completedEvent, signal });
    // Or WaitHandle.WaitAny, depending on your needs!
    target.Complete();
});
You can inspect the return value of WaitHandle.WaitAny to find out which event was signalled, and react accordingly.
You can also add other events to the code, passing them to the observer, so that it can set them when needed. You can differentiate your behaviour and respond differently when an error is raised, for example
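For example, a small sketch of reacting to whichever handle fires first, assuming the completedEvent and signal instances above:

// WaitAny returns the index of the handle that was signalled first.
int index = WaitHandle.WaitAny(new WaitHandle[] { completedEvent, signal });
if (index == 0)
{
    // completedEvent fired: the observer saw OnCompleted.
}
else
{
    // signal fired: the retry set became empty.
}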