Nested asynchronous tasks - C#

I would like to know if it's possible to improve this code for better performance. I'm new to the whole async thing on the server side, so please bear with me here:
con.GetGame(id, game => {
    foreach (Player p in game.Team1)
    {
        p.SomeExtraDetails = GetPlayerDetails(p.Id);
    }
    // I would like the player data to be set on all players
    // before ending up here
});
private PlayerDetails GetPlayerDetails(double playerId)
{
    var task = con.GetPlayer(playerId);
    PlayerDetails ret = null;
    Task continuation = task.ContinueWith(t =>
    {
        ret = t.Result;
    });
    continuation.Wait();
    return ret;
}
If I got it right, continuation.Wait(); blocks the main thread.
Is there any way to make the tasks run simultaneously?

Ideally, you'd make these operations asynchronous all the way down:
private Task<PlayerDetails> GetPlayerDetailsAsync(double playerId)
{
    return con.GetPlayer(playerId);
}

con.GetGame(id, async game => {   // the callback must be async to use await inside it
    var tasks = game.Team1
        .Select(p => new { Player = p, Details = GetPlayerDetailsAsync(p.Id) })
        .ToList(); // Force all tasks to start...

    foreach (var t in tasks)
    {
        t.Player.SomeExtraDetails = await t.Details;
    }

    // all player data is now set on all players
});
If that isn't an option (i.e. you're not using VS 2012), you could simplify your code to:
// This is a more efficient version of your existing code
private PlayerDetails GetPlayerDetails(double playerId)
{
    var task = con.GetPlayer(playerId);
    return task.Result;
}

con.GetGame(id, game => {
    // This will run all at once, but block until they're done
    Parallel.ForEach(game.Team1, p =>
    {
        p.SomeExtraDetails = GetPlayerDetails(p.Id);
    });
});

Consider using Parallel.ForEach in your GetGame callback instead of Task.ContinueWith.

Alternative solution without LINQ (although I like Reed Copsey's solution). However, beware that, as pointed out in the comments, this solution introduces an overhead by encapsulating the call to GetPlayerDetailsAsync() inside Tasks created by Task.Run().
Requires .NET 4.5 and C# 5.
con.GetGame(id, game => {
    var tasks = new List<Task>();
    foreach (Player p in game.Team1)
    {
        tasks.Add(Task.Run(async () => p.SomeExtraDetails = await GetPlayerDetailsAsync(p.Id)));
    }
    Task.WaitAll(tasks.ToArray());
});

private Task<PlayerDetails> GetPlayerDetailsAsync(double playerId)
{
    return con.GetPlayerAsync(playerId);
}
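For completeness, here is a sketch of the same loop without the Task.Run wrapper (it reuses the GetPlayerDetailsAsync method above; the SetDetailsAsync helper is my own, not part of the original answer):

private async Task SetDetailsAsync(Player p)
{
    // Awaiting here means no extra thread-pool task is queued per player.
    p.SomeExtraDetails = await GetPlayerDetailsAsync(p.Id);
}

con.GetGame(id, game => {
    var tasks = game.Team1.Select(SetDetailsAsync).ToArray();
    Task.WaitAll(tasks); // still blocks, like the Task.WaitAll above
});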
Further, in order to catch up on the Task-based Asynchronous Pattern (TAP) with .NET 4.5 I highly recommend reading: Task-based Asynchronous Pattern - by Stephen Toub, Microsoft.

Related

C# How to remove a completed Task from Task[]

I have an array of tasks running like this (the number of tasks in the array is not fixed at 2; it can vary from time to time):
Task[] templates = new Task[2] {
    Task.Factory.StartNew( () => { x.foo(); }),
    Task.Factory.StartNew( () => { y.bar(); })
};
Then I wait for any of them to finish.
int taskID;
taskID = Task.WaitAny(templates);
As soon as a task has finished, I inspect various parameters inside the object (x or y), and if I don't have the result I expected, I want to remove the finished task from the template array. Something like this:
template[taskID].RemoveThisTaskFromArray() // This is pseudo code to describe what I want.
taskID = Task.WaitAny(templates);
I will do this in a loop until the task array template[] is empty or I decide to abort the operation and break out of the loop.
int taskID;
do {
    taskID = Task.WaitAny(templates);
    if (template[taskID].valid == true) { // <-- example of test for success
        // stop the remaining tasks in the template[] array.
        break;
    }
    // remove the completed task so we can use WaitAny() again.
} while (template.Length > 0);
It seems like Task[] doesn't support removing elements, so there is no Task[].RemoveAt() method.
I hope you get my point. Any way to solve this?
Just use a List<Task> instead. But also, use Task.Run instead of Task.Factory.StartNew for reasons described in detail here.
var templates = new List<Task> {
    Task.Run( () => { x.foo(); }),
    Task.Run( () => { y.bar(); })
};

int taskID;
do {
    taskID = Task.WaitAny(templates.ToArray());
    if (templates[taskID].valid == true) { // <-- example of test for success
        // stop the remaining tasks in the templates list.
        break;
    }
    templates.RemoveAt(taskID);
} while (templates.Count > 0);
This calls .ToArray(), which creates a new array (preserving order), so it is slightly inefficient, but it works. This is really the easiest way to give Task.WaitAny the array it's looking for, while preserving your ability to easily remove items.
But I don't know what you're trying to do with template[taskID].valid. Did you mean to use your templates (with an 's') array? (that's what I assumed) But Task doesn't have any property called valid. Did you mean to look at the Status property?
However, you can consider using async and await with Task.WhenAny to avoid blocking the current thread while you wait. Task.WhenAny also accepts any IEnumerable<Task>, so you don't need to call .ToArray().
do {
    var doneTask = await Task.WhenAny(templates);
    // This will return any result, but also throw any exception that
    // might have happened inside the task.
    await doneTask;
    templates.Remove(doneTask);
} while (templates.Count > 0);
There are some very well-written articles about asynchronous programming, starting here: Asynchronous programming with async and await
Removing completed tasks from the list can also be done with List<T>.RemoveAll, like this:
templates.RemoveAll(x => x.IsCompleted);

Parallel Task and Subtasks workflow

I'm new to C# threads and tasks, and I'm trying to develop a workflow, so far without success, probably because I'm mixing tasks with foreach iterations...
The point is:
I've got a bunch of lists, and inside each one there are some things to do. They need to run as parallel and non-blocking as possible, and as soon as each subBunchOfThingsTodo is done (meaning every thing to do inside it has finished, in parallel) it has to do some business (DoSomethingAfterEveryThingToDoOfThisSubBunchOfThingsAreDone()).
e.g:
bunchOfSubBunchsOfThingsTodo
    subBunchOfThingsTodo
        ThingToDo1
        ThingToDo2
    subBunchOfThingsTodo
        ThingToDo1
        ThingToDo2
        ThingToDo3
    subBunchOfThingsTodo
        ThingToDo1
        ThingToDo2...
This is how I'm trying it, but unfortunately each iteration of bunchOfSubBunchsOfThingsTodo waits for the previous one, and I need them to work in parallel.
The same happens with the things to do: they wait for the previous thing to start...
List<X> bunchOfSubBunchsOfThingsTodo = getBunchOfSubBunchsOfThingsTodo();

foreach (var subBunchOfThingsToDo in bunchOfSubBunchsOfThingsTodo)
{
    int idSubBunchOfThingsToDo = subBunchOfThingsToDo.ThingsToDo.FirstOrDefault().IdSubBunchOfThingsToDo;

    var parent = Task.Factory.StartNew(() =>
    {
        foreach (var thingToDo in subBunchOfThingsToDo.ThingsToDo)
        {
            var child = Task.Factory.StartNew(() =>
            {
                //Do some stuff with thingToDo... Here I call several business methods
            });
        }
    });

    parent.Wait();
    DoSomethingAfterEveryThingToDoOfThisSubBunchOfThingsAreDone(idSubBunchOfThingsToDo);
}
You may want to try using Task.WhenAll and playing with LINQ to generate a collection of hot tasks:
// Returning Task (rather than async void) lets callers await the whole operation.
static async Task ProcessThingsToDo(IEnumerable<ThingToDo> bunchOfThingsToDo)
{
    IEnumerable<Task> GetSubTasks(ThingToDo thing)
        => thing.SubBunchOfThingsToDo.Select(async subThing => await Task.Run(subThing));

    var tasks = bunchOfThingsToDo
        .Select(async thing => await Task.WhenAll(GetSubTasks(thing)));

    await Task.WhenAll(tasks);
}
This way you are running each subThingToDo on a separate task, and you get a single Task composed of all the subtasks for each thingToDo.
EDIT
ThingToDo is a rather simple class in this sample:
class ThingToDo
{
    public IEnumerable<Action> SubBunchOfThingsToDo { get; }
}
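For illustration, usage could look something like this (a sketch that assumes ThingToDo gets a constructor populating SubBunchOfThingsToDo, which the class above doesn't actually define):

var things = new[]
{
    new ThingToDo(new Action[] { () => Console.WriteLine("1a"), () => Console.WriteLine("1b") }),
    new ThingToDo(new Action[] { () => Console.WriteLine("2a") })
};

await ProcessThingsToDo(things); // completes once every sub-action has run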
With minimal changes to your code you can try it this way:
var toWait = new List<Task>();
List<X> bunchOfSubBunchsOfThingsTodo = getBunchOfSubBunchsOfThingsTodo();

foreach (var subBunchOfThingsToDo in bunchOfSubBunchsOfThingsTodo)
{
    int idSubBunchOfThingsToDo = subBunchOfThingsToDo.ThingsToDo.FirstOrDefault().IdSubBunchOfThingsToDo;

    var parent = Task.Factory.StartNew(() =>
    {
        Parallel.ForEach(subBunchOfThingsToDo.ThingsToDo,
            thingToDo =>
            {
                //Do some stuff with thingToDo... Here I call several business methods
            });
    });

    //parent.Wait();
    // ContinueWith already schedules the continuation, so no Start() call is needed.
    var handle = parent.ContinueWith(x =>
    {
        DoSomethingAfterEveryThingToDoOfThisSubBunchOfThingsAreDone(idSubBunchOfThingsToDo);
    });

    toWait.Add(handle);
}

Task.WhenAll(toWait).Wait(); // or await Task.WhenAll(toWait) in an async method
Thanks to the downvoters, who advised this 'good' solution:
var bunchOfSubBunchsOfThingsTodo = getBunchOfSubBunchsOfThingsTodo();

var toWait = bunchOfSubBunchsOfThingsTodo
    .Select(subBunchOfThingsToDo =>
    {
        return Task.Run(() =>
        {
            int idSubBunchOfThingsToDo = subBunchOfThingsToDo.ThingsToDo.FirstOrDefault().IdSubBunchOfThingsToDo;

            Parallel.ForEach(subBunchOfThingsToDo.ThingsToDo,
                thingToDo =>
                {
                    //Do some stuff with thingToDo... Here I call several business methods
                });

            DoSomethingAfterEveryThingToDoOfThisSubBunchOfThingsAreDone(idSubBunchOfThingsToDo);
        });
    });

Task.WhenAll(toWait).Wait(); // or await Task.WhenAll(toWait) in an async method

Parallel ForEach wait 500 ms before spawning

I have this situation:
var tasks = new List<ITask> ...
Parallel.ForEach(tasks, currentTask => currentTask.Execute() );
Is it possible to instruct PLINQ to wait for 500 ms before the next thread is spawned?
System.Threading.Thread.Sleep(5000);
You are using Parallel.ForEach completely wrong. You should make a special enumerator that rate-limits itself to fetching data once every 500 ms.
I made some assumptions about how your DTO works, since you didn't provide any details.
// Returns IEnumerable<T> so it can be passed straight to Parallel.ForEach.
private IEnumerable<SomeResource> GetRateLimitedResource()
{
    SomeResource someResource = null;
    do
    {
        someResource = _remoteProvider.GetData();
        if (someResource != null)
        {
            yield return someResource;
            Thread.Sleep(500);
        }
    } while (someResource != null);
}
Here is how your Parallel.ForEach should look then:
Parallel.ForEach(GetRateLimitedResource(), SomeFunctionToProcessSomeResource);
There are already some good suggestions. I would agree with others that you are using PLINQ in a manner it wasn't meant to be used.
My suggestion would be to use System.Threading.Timer. This is probably better than writing a method that returns an IEnumerable<> that forces a half second delay, because you may not need to wait the full half second, depending on how much time has passed since your last API call.
With the timer, it will invoke a delegate that you've provided it at the interval you specify, so even if the first task isn't done, a half second later it will invoke your delegate on another thread, so there won't be any extra waiting.
From your example code, it sounds like you have a list of tasks, in this case, I would use System.Collections.Concurrent.ConcurrentQueue to keep track of the tasks. Once the queue is empty, turn off the timer.
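A rough sketch of that timer-based approach might look like this (ITask and Execute() come from the question; the RateLimitedRunner class and everything else here are my own assumptions):

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

class RateLimitedRunner
{
    private readonly ConcurrentQueue<ITask> _queue;
    private Timer _timer;

    public RateLimitedRunner(IEnumerable<ITask> tasks)
    {
        _queue = new ConcurrentQueue<ITask>(tasks);
    }

    public void Start()
    {
        // Fire the callback immediately, then every 500 ms after that.
        _timer = new Timer(OnTick, null, TimeSpan.Zero, TimeSpan.FromMilliseconds(500));
    }

    private void OnTick(object state)
    {
        ITask task;
        if (_queue.TryDequeue(out task))
        {
            // Start the next task on the thread pool without waiting for
            // the previous one to finish.
            Task.Run(() => task.Execute());
        }
        else
        {
            // Queue is empty: turn off the timer.
            _timer.Dispose();
        }
    }
}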
You could use Enumerable.Aggregate instead.
var task = tasks.Aggregate((t1, t2) =>
    t1.ContinueWith(_ =>
    {
        // No async lambda here, so ContinueWith returns Task<T> and the chain stays composable.
        Thread.Sleep(500);
        return t2.Result;
    }));
If you don't want the tasks chained, there is also the indexed overload of Select, assuming the tasks are in order of delay.
var tasks = Enumerable
    .Range(1, 10)
    .Select(x => Task.Run(() => x * 2))
    .Select((x, i) => Task.Delay(TimeSpan.FromMilliseconds(i * 500))
        .ContinueWith(_ => x.Result));

foreach (var result in tasks.Select(x => x.Result))
{
    Console.WriteLine(result);
}
From the comments, a better option would be to guard the resource instead of using the time delay.
static object Locker = new object();

static int GetResultFromResource(int arg)
{
    lock (Locker)
    {
        Thread.Sleep(500);
        return arg * 2;
    }
}

var tasks = Enumerable
    .Range(1, 10)
    .Select(x => Task.Run(() => GetResultFromResource(x)));

foreach (var result in tasks.Select(x => x.Result))
{
    Console.WriteLine(result);
}
In this case how about a Producer-Consumer pattern with a BlockingCollection<T>?
var tasks = new BlockingCollection<ITask>();

// add tasks; if this is an expensive process, put it out onto a Task
// tasks.Add(x);

// we're done producin' (allows GetConsumingEnumerable to finish)
tasks.CompleteAdding();

RunTasks(tasks);
With a single consumer thread:
static void RunTasks(BlockingCollection<ITask> tasks)
{
    foreach (var task in tasks.GetConsumingEnumerable())
    {
        task.Execute();

        // this may not be as accurate as you would like
        Thread.Sleep(500);
    }
}
If you have access to .NET 4.5 you can use Task.Delay:
static void RunTasks(BlockingCollection<ITask> tasks)
{
    foreach (var task in tasks.GetConsumingEnumerable())
    {
        Task.Delay(500)
            .ContinueWith(_ => task.Execute())
            .Wait();
    }
}

async i/o and process results as they become available

I have a simple console app where I want to call many URLs in a loop and put the results in a database table. I am using .NET 4.5 and async I/O to fetch the URL data. Here is a simplified version of what I am doing. All methods are async except for the database operation. Do you guys see any issues with this? Are there better ways of optimizing?
private async Task Run()
{
    var items = repo.GetItems(); // sync method to get list from database
    var tasks = new List<Task>();

    // add each call to the task list and process each result as it becomes available
    // rather than waiting for all downloads
    foreach (Item item in items)
    {
        tasks.Add(GetFromWeb(item.url).ContinueWith(response => { AddToDatabase(response.Result); }));
    }

    await Task.WhenAll(tasks); // wait for all tasks to complete.
}
private async Task<string> GetFromWeb(string url)
{
    HttpResponseMessage response = await GetAsync(url);
    return await response.Content.ReadAsStringAsync();
}
private void AddToDatabase(string item)
{
    // add data to database.
}
Your solution is acceptable. But you should check out TPL Dataflow, which allows you to set up a dataflow "mesh" (or "pipeline") and then shove the data through it.
For a problem this simple, Dataflow won't really add much other than getting rid of the ContinueWith (I always find manual continuations awkward). But if you plan to add more steps or change your data flow in the future, Dataflow should be something you consider.
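For reference, here is a rough sketch of what such a Dataflow pipeline could look like (this is my own sketch, not code from the answer; it reuses the question's GetFromWeb, AddToDatabase, repo and Item, and the MaxDegreeOfParallelism value is an arbitrary assumption):

using System.Threading.Tasks.Dataflow; // System.Threading.Tasks.Dataflow / Microsoft.Tpl.Dataflow NuGet package

private async Task RunWithDataflow()
{
    // Download step: several downloads can run concurrently.
    var download = new TransformBlock<Item, string>(
        item => GetFromWeb(item.url),
        new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 4 });

    // Database step: single-threaded, since AddToDatabase is synchronous.
    var store = new ActionBlock<string>(result => AddToDatabase(result));

    download.LinkTo(store, new DataflowLinkOptions { PropagateCompletion = true });

    foreach (var item in repo.GetItems())
        download.Post(item);

    download.Complete();
    await store.Completion;
}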
Your solution is pretty much correct, with just two minor mistakes (both of which cause compiler errors). First, you don't call ContinueWith on the result of List.Add; you need to call ContinueWith on the task and then add the continuation to your list, which is solved by just moving a parenthesis. You also need to call Result on the response Task.
Here is the section with the two minor changes:
tasks.Add(GetFromWeb(item.url)
    .ContinueWith(response => { AddToDatabase(response.Result); }));
Another option is to leverage a method that takes a sequence of tasks and orders them by the order in which they complete. Here is my implementation of such a method:
public static IEnumerable<Task<T>> Order<T>(this IEnumerable<Task<T>> tasks)
{
    var taskList = tasks.ToList();
    var taskSources = new BlockingCollection<TaskCompletionSource<T>>();
    var taskSourceList = new List<TaskCompletionSource<T>>(taskList.Count);

    foreach (var task in taskList)
    {
        var newSource = new TaskCompletionSource<T>();
        taskSources.Add(newSource);
        taskSourceList.Add(newSource);

        task.ContinueWith(t =>
        {
            var source = taskSources.Take();

            if (t.IsCanceled)
                source.TrySetCanceled();
            else if (t.IsFaulted)
                source.TrySetException(t.Exception.InnerExceptions);
            else if (t.IsCompleted)
                source.TrySetResult(t.Result);
        }, CancellationToken.None, TaskContinuationOptions.PreferFairness, TaskScheduler.Default);
    }

    return taskSourceList.Select(tcs => tcs.Task);
}
Using this your code can become:
private async Task Run()
{
    IEnumerable<Item> items = repo.GetItems(); // sync method to get list from database

    foreach (var task in items.Select(item => GetFromWeb(item.url))
                              .Order())
    {
        await task.ConfigureAwait(false);
        AddToDatabase(task.Result);
    }
}
Just thought I'd throw my hat in as well with an Rx solution:
using System;
using System.Reactive.Concurrency;
using System.Reactive.Linq;
using System.Reactive.Threading.Tasks;

private Task Run()
{
    var fromWebObservable = from item in repo.GetItems().ToObservable(Scheduler.Default)
                            select GetFromWeb(item.url);

    return fromWebObservable
        .SelectMany(t => t.ToObservable()) // unwrap each Task<string> as it completes
        .Do(AddToDatabase)
        .ToTask();
}

Speculative execution using the TPL

I have a List<Task<bool>> that I want to enumerate in parallel, finding the first task to complete with a result of true, without waiting for or observing exceptions on any of the other tasks still pending.
var tasks = new List<Task<bool>>
{
    Task.Delay(2000).ContinueWith(x => false),
    Task.Delay(0).ContinueWith(x => true),
};
I have tried to use PLINQ to do something like:
var task = tasks.AsParallel().FirstOrDefault(t => t.Result);
This executes in parallel, but doesn't return as soon as it finds a satisfying result, because accessing the Result property is blocking. In order for this to work using PLINQ, I'd have to write this awful statement:
var cts = new CancellationTokenSource();
var task = tasks.AsParallel()
    .FirstOrDefault(t =>
    {
        try
        {
            t.Wait(cts.Token);
            if (t.Result)
            {
                cts.Cancel();
            }
            return t.Result;
        }
        catch (OperationCanceledException)
        {
            return false;
        }
    });
I've written up an extension method that yields tasks as they complete like so.
public static class Exts
{
    public static IEnumerable<Task<T>> InCompletionOrder<T>(this IEnumerable<Task<T>> source)
    {
        var tasks = source.ToList();
        while (tasks.Any())
        {
            var t = Task.WhenAny(tasks);
            yield return t.Result;
            tasks.Remove(t.Result);
        }
    }
}
// and run like so
var task = tasks.InCompletionOrder().FirstOrDefault(t => t.Result);
But it feels like this is something common enough that there is a better way. Suggestions?
Maybe something like this?
var tcs = new TaskCompletionSource<Task<bool>>();

foreach (var task in tasks)
{
    task.ContinueWith((t, state) =>
    {
        if (t.Result)
        {
            ((TaskCompletionSource<Task<bool>>)state).TrySetResult(t);
        }
    },
    tcs,
    TaskContinuationOptions.OnlyOnRanToCompletion |
    TaskContinuationOptions.ExecuteSynchronously);
}

var firstTaskToComplete = tcs.Task;
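The winning task (if any) could then be consumed like this (a small usage sketch of my own; note that tcs.Task never completes if none of the tasks returns true, so you may want a timeout or a fallback continuation):

var winningTask = await firstTaskToComplete; // the Task<bool> that finished with true
bool result = winningTask.Result;            // true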
Perhaps you could try the Rx.NET library. It's very good at, in effect, providing LINQ over work items.
Try this snippet in LINQPad after you reference the Microsoft Rx.NET assemblies.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Reactive.Concurrency;
using System.Reactive.Linq;
using System.Reactive.Threading.Tasks;
using System.Threading.Tasks;

void Main()
{
    var tasks = new List<Task<bool>>
    {
        Task.Delay(2000).ContinueWith(x => false),
        Task.Delay(0).ContinueWith(x => true),
    };

    var observable = (from t in tasks.ToObservable()
                      // Convert each task to an observable
                      let o = t.ToObservable()
                      // SelectMany
                      from x in o
                      select x);

    var foo = observable
        .SubscribeOn(Scheduler.Default) // Run the tasks on the thread pool
        .ToList()
        .First();

    Console.WriteLine(foo);
}
First, I don't understand why you are trying to use PLINQ here. Enumerating a list of Tasks shouldn't take long, so I don't think you're going to gain anything from parallelizing it.
Now, to get the first Task that already completed with true, you can use the (non-blocking) IsCompleted property:
var task = tasks.FirstOrDefault(t => t.IsCompleted && t.Result);
If you wanted to get a collection of Tasks, ordered by their completion, have a look at Stephen Toub's article Processing tasks as they complete. If you want to list those that return true first, you would need to modify that code. If you don't want to modify it, you can use a version of this approach from Stephen Cleary's AsyncEx library.
Also, in the specific case in your question, you could “fix” your code by adding .WithMergeOptions(ParallelMergeOptions.NotBuffered) to the PLINQ query. But doing so still wouldn't work most of the time, and it can waste a lot of threads even when it does. That's because PLINQ uses a constant number of threads with partitioning, and using Result would block those threads most of the time.
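If modifying that code isn't appealing, a plain Task.WhenAny loop gives the same "first task that completes with true" behavior without PLINQ. A sketch (the method name and shape are my own, not from the answer above; it needs System.Collections.Generic, System.Linq and System.Threading.Tasks):

static async Task<Task<bool>> FirstTrueAsync(IEnumerable<Task<bool>> source)
{
    var pending = source.ToList();
    while (pending.Count > 0)
    {
        var done = await Task.WhenAny(pending);
        if (done.Status == TaskStatus.RanToCompletion && done.Result)
            return done;          // the first task that finished with true
        pending.Remove(done);     // skip tasks that were false, faulted, or cancelled
    }
    return null;                  // no task produced true
}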
