I have a very basic, linear pipeline, in which I'd like to propagate completion and wait until everything completes:
static void Main(string[] args)
{
ExecutePipeline().Wait();
}
static async Task ExecutePipeline()
{
var addBlock = new TransformBlock<int, int>(x =>
{
var result = x + 2;
Console.WriteLine(result);
return result;
});
var subBlock = new TransformBlock<int, int>(x =>
{
var result = x - 2;
Console.WriteLine(result);
return result;
});
var mulBlock = new TransformBlock<int, int>(x =>
{
var result = x * 2;
Console.WriteLine(result);
return result;
});
var divBlock = new TransformBlock<int, int>(x =>
{
var result = x / 2;
Console.WriteLine(result);
return result;
});
var flowOptions = new DataflowLinkOptions { PropagateCompletion = true };
addBlock.LinkTo(mulBlock, flowOptions);
mulBlock.LinkTo(subBlock, flowOptions);
subBlock.LinkTo(divBlock, flowOptions);
addBlock.Post(4);
addBlock.Complete();
mulBlock.Complete();
subBlock.Complete();
await divBlock.Completion;
}
Unfortunately, in its current state, only the result of addBlock gets printed and the program terminates, instead of printing all of the results before termination.
If I comment out all of the lines which call Complete() on their blocks, or if I leave only addBlock.Complete() uncommented, I get a printout of all results in the pipeline, but the program never ends, since completion is not propagated. However, if I uncomment either mulBlock.Complete() or subBlock.Complete(), similarly to how the default code behaves, the program prints out only the result of addBlock and terminates.
What's interesting is that uncommenting either of those last two calls, or all of them, results in the same behavior, which makes me question how the completion propagates if one of them is commented out. Obviously, I'm missing something in the logic, but I just can't figure out what it is. How would I accomplish the desired behavior of printing all of the results?
EDIT:
So, I finally found something that worked for me at https://stackoverflow.com/a/26803579/2006048
It appears that I needed to change the last block of code to simply this:
addBlock.Post(4);
addBlock.Complete();
await addBlock.Completion;
The original code did not work because Complete() was called on each block before data could propagate, so it was a case of a race condition.
However, with this new edited code, it's calling Complete() on addBlock and awaiting its completion. This makes the program work as intended, but leaves me yet more confused. Why is it that Completion must be awaited on addBlock and not on the last block in the chain, which is divBlock? I would have thought that Complete() only needs to be called on addBlock because PropagateCompletion is set to true, but then I would expect to wait for completion of the last block, not the first one.
If I await the completion of mulBlock, then only the results of addBlock get printed. If I await the completion of subBlock, the results of addBlock and mulBlock get printed. If I await the completion of divBlock, the results of addBlock, mulBlock and subBlock get printed.
I was basing my code on Stephen Cleary's Concurrency in C# Cookbook example (Section 4.1 Linking Blocks (Page 48)):
var multiplyBlock = new TransformBlock<int, int>(item => item * 2);
var subtractBlock = new TransformBlock<int, int>(item => item - 2);
var options = new DataflowLinkOptions { PropagateCompletion = true };
multiplyBlock.LinkTo(subtractBlock, options);
...
// The first block's completion is automatically propagated to the second block.
multiplyBlock.Complete();
await subtractBlock.Completion;
When I set up Cleary's code to match what I have, the same behavior is exhibited. The program prints the result and terminates only when I await multiplyBlock.Completion.
The problem is that a block completes only after all its queues are emptied, which includes the output queue. What happens in your case is that the completion propagates correctly, but then divBlock gets stuck in the "almost complete" mode, waiting for the item in its output queue to be removed.
To solve this, you can either change divBlock to be an ActionBlock, or you can link it to a DataflowBlock.NullTarget<int>().
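For example, here is a minimal sketch of the NullTarget option, keeping the rest of the pipeline from the question unchanged (the only assumption is that divBlock's output can simply be discarded):
// Drain divBlock's output queue so the block can actually finish.
subBlock.LinkTo(divBlock, flowOptions);
divBlock.LinkTo(DataflowBlock.NullTarget<int>());

addBlock.Post(4);
addBlock.Complete();        // complete only the first block
await divBlock.Completion;  // now the propagated completion reaches the end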
I have a bunch of requests to process, some of which may complete synchronously.
I'd like to gather all results that are immediately available and return them early, while waiting for the rest.
Roughly like this:
List<Task<Result>> tasks = new ();
List<Result> results = new ();
foreach (var request in myRequests) {
var task = request.ProcessAsync();
if (task.IsCompleted)
results.Add(task.Result); // or Add(await task) ?
else
tasks.Add(task);
}
// send results that are available "immediately" while waiting for the rest
if (results.Count > 0) SendResults(results);
var remaining = await Task.WhenAll(tasks);
SendResults(remaining);
I'm not sure whether relying on IsCompleted might be a bad idea; could there be situations where its result cannot be trusted, or where it may change back to false again, etc.?
Similarly, could it be dangerous to use task.Result even after checking IsCompleted, or should one always prefer await task? What if we were using ValueTask instead of Task?
I'm not sure whether relying on IsCompleted might be a bad idea; could there be situations where its result cannot be trusted...
If you're in a multithreaded context, it's possible that IsCompleted could return false at the moment when you check on it, but it completes immediately thereafter. In cases like the code you're using, the cost of this happening would be very low, so I wouldn't worry about it.
or where it may change back to false again, etc.?
No, once a Task completes, it cannot uncomplete.
could it be dangerous to use task.Result even after checking IsCompleted?
Nope, that should always be safe.
should one always prefer await task?
await is a great default when you don't have a specific reason to do something else, but there are a variety of use cases where other patterns might be useful. The use case you've highlighted is a good example, where you want to return the results of finished tasks without awaiting all of them.
As Stephen Cleary mentioned in a comment below, it may still be worthwhile to use await to maintain expected exception behavior. You might consider doing something more like this:
var tasksByIsCompleted = myRequests.Select(r => r.ProcessAsync()).ToLookup(t => t.IsCompleted);
// send results that are available "immediately" while waiting for the rest
SendResults(await Task.WhenAll(tasksByIsCompleted[true]));
SendResults(await Task.WhenAll(tasksByIsCompleted[false]));
What if we were using ValueTask instead of Task?
The answers above apply equally to both types.
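For what it's worth, with ValueTask the same pattern would look roughly like this (a hypothetical sketch, assuming ProcessAsync() returned ValueTask<Result>; note that a ValueTask should only be consumed once, so convert the incomplete ones to a Task before awaiting them later):
ValueTask<Result> vt = request.ProcessAsync(); // assumed ValueTask-returning variant
if (vt.IsCompleted)
    results.Add(vt.Result);   // already completed, safe to read synchronously
else
    tasks.Add(vt.AsTask());   // consume the ValueTask exactly once, as a Task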
You could use code like this to continually send the results of completed tasks while waiting on others to complete.
var tasks = new List<Task<Result>>();
foreach (var request in myRequests)
{
tasks.Add(request.ProcessAsync());
}
// wait for at least one task to be complete, then send all available results
while (tasks.Count > 0)
{
// wait for at least one task to complete
Task.WaitAny(tasks.ToArray());
// send results for each completed task
var completedTasks = tasks.Where(t => t.IsCompleted).ToList();
var results = completedTasks.Where(t => t.IsCompletedSuccessfully).Select(t => t.Result).ToList();
SendResults(results);
// TODO: handle completed but failed tasks here
// remove completed tasks from the tasks list and keep waiting
tasks.RemoveAll(t => completedTasks.Contains(t));
}
Using only await you can achieve the desired behavior:
async Task ProcessAsync(MyRequest request, Sender sender)
{
var result = await request.ProcessAsync();
await sender.SendAsync(result);
}
...
async Task ProcessAll()
{
var tasks = new List<Task>();
foreach(var request in requests)
{
var task = ProcessAsync(request, sender);
// Don't await until all requests are queued up
tasks.Add(task);
}
// Await on all outstanding requests
await Task.WhenAll(tasks);
}
There are already good answers, but in addition to them here is my suggestion on how to handle multiple tasks and process each task's result differently; maybe it will suit your needs. My example uses events, but you can replace them with whatever kind of state management fits your needs.
public interface IRequestHandler
{
event Func<object, Task> Ready;
Task ProcessAsync();
}
public class RequestHandler : IRequestHandler
{
// Here is where you wrap your request:
// private object request;
private readonly int value;
public RequestHandler(int value)
=> this.value = value;
public event Func<object, Task> Ready;
public async Task ProcessAsync()
{
await Task.Delay(1000 * this.value);
// Here is where you call:
// var result = await request.ProcessAsync();
// ... then do something with the result, or wrap the call in try/catch, for example
var result = $"RequestHandler {this.value} - [{DateTime.Now.ToLongTimeString()}]";
if (this.Ready is not null)
{
// If the result passes, send it to all subscribers
await this.Ready.Invoke(result);
}
}
}
static void Main()
{
var a = new RequestHandler(1);
a.Ready += PrintAsync;
var b = new RequestHandler(2);
b.Ready += PrintAsync;
var c = new RequestHandler(3);
c.Ready += PrintAsync;
var d = new RequestHandler(4);
d.Ready += PrintAsync;
var e = new RequestHandler(5);
e.Ready += PrintAsync;
var f = new RequestHandler(6);
f.Ready += PrintAsync;
var requests = new List<IRequestHandler>()
{
a, b, c, d, e, f
};
var tasks = requests
.Select(x => Task.Run(x.ProcessAsync));
// Here you must await all of the tasks
Task
.Run(async () => await Task.WhenAll(tasks))
.Wait();
}
static Task PrintAsync(object output)
{
Console.WriteLine(output);
return Task.CompletedTask;
}
I've got this code snippet that tries to use a TransformBlock to kick off some code execution, as below:
public static void Main(string[] args)
{
var multiplyBlock = new TransformBlock<int, int>(x => x * 2);
var additionBlock = new TransformBlock<int, int>(x => x + 2);
multiplyBlock.LinkTo(additionBlock, new DataflowLinkOptions { PropagateCompletion = true });
multiplyBlock.Post(3);
additionBlock.Completion.ContinueWith(x => Console.WriteLine(x));
multiplyBlock.Complete();
additionBlock.Completion.Wait();
}
But when I run this code, it hangs and prints nothing. I tried to debug it and found that all the code lines finish, but at the end of the function the program hangs. What's happening here, and how do I fix it?
Thanks.
You need to use an ActionBlock to consume the output from the TransformBlock, like so:
public static void Main(string[] args)
{
var multiplyBlock = new TransformBlock<int, int>(x => x * 2);
var additionBlock = new TransformBlock<int, int>(x => x + 2);
var outputBlock = new ActionBlock<int>(Console.WriteLine);
multiplyBlock.LinkTo(additionBlock, new DataflowLinkOptions { PropagateCompletion = true });
additionBlock.LinkTo(outputBlock, new DataflowLinkOptions { PropagateCompletion = true });
multiplyBlock.Post(3);
multiplyBlock.Complete();
outputBlock.Completion.Wait();
}
If you just try to use TransformBlock nothing happens because there's nothing to consume all the output.
Think of it this way: a TransformBlock has an input and an output, so the output must be consumed somehow. An ActionBlock only has an input, and it will be called repeatedly with the output from a TransformBlock. (This is somewhat of an oversimplification, but it should help you understand the wiring.)
In the example above, the output of the multiply block is wired to the input of the addition block, and the output of the addition block is wired to the input of the output block.
The input to the multiply block must come from somewhere, and here it comes from the call to multiplyBlock.Post(3);.
(Actually, if you want to consume the output of a TransformBlock explicitly, you can do so by calling TransformBlock.Receive() (or one of the other receive methods) in a loop to process the data, but this is not necessary, since it's much easier to use an ActionBlock to consume the transformed data.)
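For illustration only, that manual approach might look roughly like this (a sketch assuming an async context; OutputAvailableAsync and ReceiveAsync are the dataflow receive extension methods):
multiplyBlock.Post(3);
multiplyBlock.Complete();
while (await additionBlock.OutputAvailableAsync())
{
    int value = await additionBlock.ReceiveAsync();
    Console.WriteLine(value);
}
await additionBlock.Completion;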
This documentation may be useful to you.
I have a method in which I'm retrieving a list of deployments. For each deployment I want to retrieve an associated release. Because all calls are made to an external API, I now have a foreach-loop in which those calls are made.
public static async Task<List<Deployment>> GetDeployments()
{
try
{
var depjson = await GetJson($"{BASEURL}release/deployments?deploymentStatus=succeeded&definitionId=2&definitionEnvironmentId=5&minStartedTime={MinDateTime}");
var deployments = (JsonConvert.DeserializeObject<DeploymentWrapper>(depjson))?.Value?.OrderByDescending(x => x.DeployedOn)?.ToList();
foreach (var deployment in deployments)
{
var reljson = await GetJson($"{BASEURL}release/releases/{deployment.ReleaseId}");
deployment.Release = JsonConvert.DeserializeObject<Release>(reljson);
}
return deployments;
}
catch (Exception)
{
throw;
}
}
This all works perfectly fine. However, I do not like the await in the foreach-loop at all. I also believe this is not considered good practice. I just don't see how to refactor this so the calls are made parallel, because the result of each call is used to set a property of the deployment.
I would appreciate any suggestions on how to make this method faster and, whenever possible, avoid the await-ing in the foreach-loop.
There is nothing wrong with what you are doing now. But there is a way to start all tasks at once instead of waiting for a single task, processing it, and then waiting for another one.
This is how you can turn this:
wait for one -> process -> wait for one -> process ...
into
wait for all -> process -> done
Convert this:
foreach (var deployment in deployments)
{
var reljson = await GetJson($"{BASEURL}release/releases/{deployment.ReleaseId}");
deployment.Release = JsonConvert.DeserializeObject<Release>(reljson);
}
To:
var deplTasks = deployments.Select(d => GetJson($"{BASEURL}release/releases/{d.ReleaseId}"));
var reljsons = await Task.WhenAll(deplTasks);
for(var index = 0; index < deployments.Count; index++)
{
deployments[index].Release = JsonConvert.DeserializeObject<Release>(reljsons[index]);
}
First you get a list of unfinished tasks. Then you await them and get a collection of results (reljsons). Then you have to deserialize them and assign them to Release.
By using await Task.WhenAll() you wait for all the tasks at the same time, so you should see a performance boost from that.
Let me know if there are typos, I didn't compile this code.
Fcin suggested starting all tasks, awaiting them all to finish, and only then starting to deserialize the fetched data.
However, if the first task has already finished while the second task has not and is internally awaiting, the first task could already start deserializing. This would shorten the time that your process is idly waiting.
So instead of:
var deplTasks = deployments.Select(d => GetJson($"{BASEURL}release/releases/{d.ReleaseId}"));
var reljsons = await Task.WhenAll(deplTasks);
for(var index = 0; index < deployments.Count; index++)
{
deployments[index].Release = JsonConvert.DeserializeObject<Release>(reljsons[index]);
}
I'd suggest the following slight change:
// async fetch the Release data of Deployment:
private async Task<Release> FetchReleaseDataAsync(Deployment deployment)
{
var reljson = await GetJson($"{BASEURL}release/releases/{deployment.ReleaseId}");
return JsonConvert.DeserializeObject<Release>(reljson);
}
// async fill the Release data of Deployment:
private async Task FillReleaseDataAsync(Deployment deployment)
{
deployment.Release = await FetchReleaseDataAsync(deployment);
}
Then your procedure is similar to the solution that Fcin suggested:
IEnumerable<Task> tasksFillDeploymentWithReleaseData = deployments
.Select(deployment => FillReleaseDataAsync(deployment))
.ToList();
await Task.WhenAll(tasksFillDeploymentWithReleaseData);
Now if the first task has to wait while fetching its release data, the second task begins, then the third, and so on. If the first task has already finished fetching its release data while the other tasks are still awaiting theirs, the first task starts deserializing right away and assigns the result to deployment.Release, after which the first task is complete.
If, for instance, the 7th task got its data but the 2nd task is still waiting, the 7th task can deserialize and assign the data to deployment.Release. Task 7 is then completed.
This continues until all tasks are completed. Using this method there is less waiting time, because as soon as one task has its data it is scheduled to start deserializing.
If I understand you right and you want to make the var reljson = await GetJson call parallel, try this:
Parallel.ForEach(deployments, async (deployment) =>
{
var reljson = await GetJson($"{BASEURL}release/releases/{deployment.ReleaseId}");
deployment.Release = JsonConvert.DeserializeObject<Release>(reljson);
});
you might limit the number of parallel executions such as:
Parallel.ForEach(
deployments,
new ParallelOptions { MaxDegreeOfParallelism = 4 },
async (deployment) =>
{
var reljson = await GetJson($"{BASEURL}release/releases/{deployment.ReleaseId}");
deployment.Release = JsonConvert.DeserializeObject<Release>(reljson);
});
you might also want to be able to break the loop:
Parallel.ForEach(deployments, async (deployment, state) =>
{
var reljson = await GetJson($"{BASEURL}release/releases/{deployment.ReleaseId}");
deployment.Release = JsonConvert.DeserializeObject<Release>(reljson);
if (noFurtherProcessingRequired) state.Break();
});
I've got a class the implements a dataflow composed of 3 steps using TPL Dataflow.
In the constructor I create the steps as TransformBlocks and link them up using LinkTo with DataflowLinkOptions.PropagateCompletion set to true. The class exposes a single method which kicks off the workflow by calling SendAsync on the 1st step. The method returns the "Completion" property of the final step of the workflow.
At the moment the steps in the workflow appear to execute as expected, but the final step never completes unless I explicitly call Complete on it. But doing that short-circuits the workflow and none of the steps are executed. What am I doing wrong?
public class MessagePipeline {
private TransformBlock<object, object> step1;
private TransformBlock<object, object> step2;
private TransformBlock<object, object> step3;
public MessagePipeline() {
var linkOptions = new DataflowLinkOptions { PropagateCompletion = true };
step1 = new TransformBlock<object, object>(
x => {
Console.WriteLine("Step1...");
return x;
});
step2 = new TransformBlock<object, object>(
x => {
Console.WriteLine("Step2...");
return x;
});
step3 = new TransformBlock<object, object>(
x => {
Console.WriteLine("Step3...");
return x;
});
step1.LinkTo(step2, linkOptions);
step2.LinkTo(step3, linkOptions);
}
public Task Push(object message) {
step1.SendAsync(message);
step1.Complete();
return step3.Completion;
}
}
...
public class Program {
public static void Main(string[] args) {
var pipeline = new MessagePipeline();
var result = pipeline.Push("Hello, world!");
result.ContinueWith(_ => Console.WriteLine("Completed"));
Console.ReadLine();
}
}
When you link the steps, you need to pass a DataflowLinkOptions with the PropagateCompletion property set to true to propagate both completion and errors. Once you do that, calling Complete() on the first block will propagate completion to the downstream blocks.
Once a block receives the completion event, it finishes processing then notifies its linked downstream targets.
This way you can post all your data to the first step and call Complete(). The final block will only complete when all upstream blocks have completed.
For example,
var linkOptions=new DataflowLinkOptions { PropagateCompletion = true};
myFirstBlock.LinkTo(mySecondBlock,linkOptions);
mySecondBlock.LinkTo(myFinalBlock,linkOptions);
foreach(var message in messages)
{
myFirstBlock.Post(message);
}
myFirstBlock.Complete();
......
await myFinalBlock.Completion;
PropagateCompletion isn't true by default because in more complex scenarios (e.g. non-linear or dynamically changing flows) you don't want completion and errors to propagate automatically. You may also want to avoid automatic completion if you want to handle errors without terminating the entire flow.
Way back when TPL Dataflow was in beta the default was true, but this was changed for RTM.
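To illustrate, here is a small sketch of a fan-in flow where automatic propagation would be the wrong default (hypothetical blocks, completion handled manually):
var source1 = new BufferBlock<int>();
var source2 = new BufferBlock<int>();
var target = new ActionBlock<int>(Console.WriteLine);

// No PropagateCompletion here: if it were set, whichever source completed
// first would also complete the target, and the other source's items could be dropped.
source1.LinkTo(target);
source2.LinkTo(target);

source1.Post(1);
source2.Post(2);
source1.Complete();
source2.Complete();

await Task.WhenAll(source1.Completion, source2.Completion);
target.Complete();          // only now is it safe to complete the target
await target.Completion;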
UPDATE
The code never completes because the final step is a TransformBlock with no linked target to receive its output. This means that even though the block received the completion signal, it hasn't finished all its work and can't change its own Completion status.
Changing it to an ActionBlock<object> removes the issue.
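As a sketch, the relevant changes would be only these (keeping everything else in MessagePipeline the same; alternatively, keep step3 as a TransformBlock and link it to DataflowBlock.NullTarget<object>()):
private ActionBlock<object> step3;
...
// step3 now consumes its input instead of queuing output, so it can complete on its own.
step3 = new ActionBlock<object>(x => Console.WriteLine("Step3..."));
step1.LinkTo(step2, linkOptions);
step2.LinkTo(step3, linkOptions);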
You need to explicitly call Complete.
Given the following:
BufferBlock<int> sourceBlock = new BufferBlock<int>();
TransformBlock<int, int> targetBlock = new TransformBlock<int, int>(element =>
{
return element * 2;
});
sourceBlock.LinkTo(targetBlock, new DataflowLinkOptions { PropagateCompletion = true });
//feed some elements into the buffer block
for(int i = 1; i <= 1000000; i++)
{
sourceBlock.SendAsync(i);
}
sourceBlock.Complete();
targetBlock.Completion.ContinueWith(_ =>
{
//notify completion of the target block
});
The targetBlock never seems to complete and I think the reason is that all the items in the TransformBlock targetBlock are waiting in the output queue as I have not linked the targetBlock to any other Dataflow block. However, what I actually want to achieve is a notification when (A) the targetBlock is notified of completion AND (B) the input queue is empty. I do not want to care whether items still sit in the output queue of the TransformBlock. How can I go about that? Is the only way to get what I want to query the completion status of the sourceBlock AND to make sure the InputCount of the targetBlock is zero? I am not sure this is very stable (is the sourceBlock truly only marked completed if the last item in the sourceBlock has been passed to the targetBlock?). Is there a more elegant and more efficient way to get to the same goal?
Edit: I just noticed that even the "dirty" way to check on completion of the sourceBlock AND the InputCount of the targetBlock being zero is not trivial to implement. Where would that check sit? It cannot be within the targetBlock, because once the above two conditions are met, obviously no message is processed within targetBlock anymore. Also, checking on the completion status of the sourceBlock introduces a lot of inefficiency.
I believe you can't directly do this. It's possible you could get this information from some private fields using reflection, but I wouldn't recommend doing that.
But you can do this by creating custom blocks. In the case of Complete() it's simple: just create a block that forwards each method to the original block. Except Complete(), where it will also log it.
In the case of figuring out when processing of all items is complete, you could link your block to an intermediate BufferBlock. This way, the output queue will be emptied quickly, and so checking the Completion of the original block would give you a fairly accurate measurement of when the processing is complete. This would affect your measurements, but hopefully not significantly.
Another option would be to add some logging at the end of the block's delegate. This way, you could see when processing of the last item was finished.
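A rough sketch of the intermediate-buffer idea, reusing sourceBlock/targetBlock from the question (the drain block is an assumption, not part of the original code):
// Drain targetBlock's output so its output queue empties quickly;
// targetBlock.Completion then tracks "all input processed" fairly closely.
var drainBlock = new BufferBlock<int>();
targetBlock.LinkTo(drainBlock, new DataflowLinkOptions { PropagateCompletion = true });

sourceBlock.Complete();
await targetBlock.Completion; // completes once targetBlock's input and output queues are empty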
It would be nice if the TransformBlock had a ProcessingCompleted event that would fire when the block has completed the processing of all messages in its queue, but there is no such event. Below is an attempt to rectify this omission. The CreateTransformBlockEx method accepts an Action<Exception> handler, that is invoked when this "event" occurs.
The intention was to always invoke the handler before the final completion of the block. Unfortunately, in the case that the supplied CancellationToken is canceled, the completion (cancellation) happens first, and the handler is invoked some milliseconds later. Fixing this inconsistency would require some tricky workarounds and might have other unwanted side effects, so I am leaving it as is.
public static IPropagatorBlock<TInput, TOutput>
CreateTransformBlockEx<TInput, TOutput>(Func<TInput, Task<TOutput>> transform,
Action<Exception> onProcessingCompleted,
ExecutionDataflowBlockOptions dataflowBlockOptions = null)
{
if (onProcessingCompleted == null)
throw new ArgumentNullException(nameof(onProcessingCompleted));
dataflowBlockOptions = dataflowBlockOptions ?? new ExecutionDataflowBlockOptions();
var transformBlock = new TransformBlock<TInput, TOutput>(transform,
dataflowBlockOptions);
var bufferBlock = new BufferBlock<TOutput>(dataflowBlockOptions);
transformBlock.LinkTo(bufferBlock);
PropagateCompletion(transformBlock, bufferBlock, onProcessingCompleted);
return DataflowBlock.Encapsulate(transformBlock, bufferBlock);
async void PropagateCompletion(IDataflowBlock block1, IDataflowBlock block2,
Action<Exception> completionHandler)
{
try
{
await block1.Completion.ConfigureAwait(false);
}
catch { }
var exception =
block1.Completion.IsFaulted ? block1.Completion.Exception : null;
try
{
// Invoke the handler before completing the second block
completionHandler(exception);
}
finally
{
if (exception != null) block2.Fault(exception); else block2.Complete();
}
}
}
// Overload with synchronous lambda
public static IPropagatorBlock<TInput, TOutput>
CreateTransformBlockEx<TInput, TOutput>(Func<TInput, TOutput> transform,
Action<Exception> onProcessingCompleted,
ExecutionDataflowBlockOptions dataflowBlockOptions = null)
{
return CreateTransformBlockEx<TInput, TOutput>(
x => Task.FromResult(transform(x)), onProcessingCompleted,
dataflowBlockOptions);
}
The code of the local function PropagateCompletion mimics the source code of the LinkTo built-in method, when invoked with the PropagateCompletion = true option.
Usage example:
var httpClient = new HttpClient();
var downloader = CreateTransformBlockEx<string, string>(async url =>
{
return await httpClient.GetStringAsync(url);
}, onProcessingCompleted: ex =>
{
Console.WriteLine($"Download completed {(ex == null ? "OK" : "Error")}");
}, new ExecutionDataflowBlockOptions()
{
MaxDegreeOfParallelism = 10
});
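Driving the example might then look like this (a sketch; the URL is a placeholder, and the output is simply discarded):
downloader.LinkTo(DataflowBlock.NullTarget<string>()); // or link to a real consumer
downloader.Post("http://example.com");
downloader.Complete();
await downloader.Completion; // the onProcessingCompleted handler runs before this finishes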
First of all, it is not right to use an IPropagatorBlock as a leaf terminal. Still, your requirement can be fulfilled by asynchronously checking the output buffer of the targetBlock for output messages and then consuming them so that the buffer gets emptied.
BufferBlock<int> sourceBlock = new BufferBlock<int>();
TransformBlock<int, int> targetBlock = new TransformBlock<int, int>
(element =>
{
return element * 2;
});
sourceBlock.LinkTo(targetBlock, new DataflowLinkOptions {
PropagateCompletion = true });
//feed some elements into the buffer block
for (int i = 1; i <= 100; i++)
{
sourceBlock.SendAsync(i);
}
sourceBlock.Complete();
bool isOutputAvailable = await targetBlock.OutputAvailableAsync();
while(isOutputAvailable)
{
int value = await targetBlock.ReceiveAsync();
isOutputAvailable = await targetBlock.OutputAvailableAsync();
}
await targetBlock.Completion.ContinueWith(_ =>
{
Console.WriteLine("Target Block Completed");//notify completion of the target block
});