I have some simple asynchronous code below: it is a WPF app with one button and one textbox.
I used a list with five integers to mimic 5 different tasks.
My intention was that when I run all five tasks in parallel and asynchronously, I can observe the numbers being added to the textbox one by one.
And I achieved it. The method DoSomething runs all five tasks in parallel, and each task has a different execution time (simulated by Task.Delay), so the numbers appear in the textbox one by one.
The only problem I cannot figure out is: why is the string "This is end text" displayed in the textbox first? If I await the method DoSomething, it should complete first and only then should the rest of the code execute, even though in my case that is just a repaint of the GUI.
I guess this might be caused by the use of Dispatcher.BeginInvoke, which may "cause some disturbance" to the async/await mechanism, but I would appreciate a small clue and a way to avoid this behavior.
I know that I could use Progress<T> reporting to achieve a similar effect, but is there any other way to use a parallel loop and update the results progressively in WPF while avoiding the unexpected behaviour I described?
private async void Button_Click(object sender, RoutedEventArgs e)
{
    await DoSomething();
    tbResults.Text += "This is end text";
}

private async Task DoSomething()
{
    List<int> numbers = new List<int>(Enumerable.Range(1, 5));

    await Task.Run(() => Parallel.ForEach(numbers, async i =>
    {
        await Task.Delay(i * 300);
        await Dispatcher.BeginInvoke(() => tbResults.Text += i.ToString() + Environment.NewLine);
    }));
}

// output is:
// This is end text 1 2 3 4 5 (all on separate lines)
My questions:
Why is the text displayed before the method DoSomething has completed?
How can I solve or avoid this? Is there any alternative way (other than using Progress<T>)?
Any info will be highly appreciated.
The threads of Parallel.ForEach are "real" background threads. They are created and the application continues execution. The point is that Parallel.ForEach is not awaitable, therefore execution continues while the threads of the Parallel.ForEach are suspended by await.
private async Task DoSomething()
{
    List<int> numbers = new List<int>(Enumerable.Range(1, 5));

    // Create the threads of Parallel.ForEach
    await Task.Run(() =>
    {
        // Create the threads of Parallel.ForEach and continue
        Parallel.ForEach(numbers, async i =>
        {
            // await suspends the thread and forces it to return.
            // Because Parallel.ForEach is not awaitable,
            // execution leaves the scope of the Parallel.ForEach and continues.
            await Task.Delay(i * 300);
            await Dispatcher.BeginInvoke(() => tbResults.Text += i.ToString() + Environment.NewLine);
        });

        // After the threads are created, the internal await of the Parallel.ForEach delegates
        // suspends the background threads and forces execution to return from the Parallel.ForEach.
        // The Task.Run thread continues.
        Dispatcher.InvokeAsync(() => tbResults.Text += "Text while Parallel.ForEach threads are suspended");

        // Since the background threads of the Parallel.ForEach are not attached
        // to the parent Task.Run, the Task.Run completes now and returns,
        // i.e. Task.Run does not wait for child background threads to complete.
        // ==> Leave Task.Run as there is no more work.
    });

    // Leave DoSomething() and continue to execute the remaining code in Button_Click().
    // The Parallel.ForEach threads are still suspended until the await chain,
    // in this case Button_Click(), is completed.
}
The solution is to implement the pattern suggested by Clemens' comment, or an asynchronous implementation of the producer/consumer pattern using e.g. BlockingCollection or Channel to gain more control over a fixed number of workers while distributing an "unlimited" number of jobs (a Channel-based sketch follows after the next example).
private async Task DoSomething(int number)
{
    await Task.Delay(number * 300);
    Dispatcher.Invoke(() => tbResults.Text += number + Environment.NewLine);
}

private async void ButtonBase_OnClick(object sender, RoutedEventArgs e)
{
    List<int> numbers = new List<int>(Enumerable.Range(1, 5));
    List<Task> tasks = new List<Task>();

    // Alternatively use LINQ Select
    foreach (int number in numbers)
    {
        Task task = DoSomething(number);
        tasks.Add(task);
    }

    await Task.WhenAll(tasks);
    tbResults.Text += "This is end text" + Environment.NewLine;
}
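For the Channel-based producer/consumer alternative mentioned above, here is a minimal sketch (my own illustration, not part of the original answer); it assumes the results can be consumed on the UI thread of the same Window:

private async Task DoSomethingWithChannel()
{
    // requires: using System.Linq; using System.Threading.Channels;
    Channel<int> channel = Channel.CreateUnbounded<int>();
    List<int> numbers = new List<int>(Enumerable.Range(1, 5));

    // Producers: one task per job; each writes its result to the channel when it completes,
    // and the channel is closed once all producers are done.
    Task producers = Task.Run(async () =>
    {
        await Task.WhenAll(numbers.Select(async i =>
        {
            await Task.Delay(i * 300);
            await channel.Writer.WriteAsync(i);
        }));
        channel.Writer.Complete();
    });

    // Consumer: we are still on the UI thread here, so the TextBox can be
    // updated directly as each result arrives, in completion order.
    await foreach (int i in channel.Reader.ReadAllAsync())
    {
        tbResults.Text += i + Environment.NewLine;
    }

    await producers;
}

The consumer loop receives results in completion order, so the shortest delay appears first, just like in the Task.WhenAny version shown further below.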
Discussing the comments
"my intention was to run tasks in parallel and "report" once they are
completed, i.e. the task which takes the shortest would "report" first
and so on."
This is exactly what is happening in the above solution. The Task with the shortest delay appends text to the TextBox first.
"But implementing your suggested await Task.WhenAll(tasks) causes
that we need to wait for all tasks to complete and then report all at
once."
To process Task objects in their order of completion, you would replace Task.WhenAll with Task.WhenAny. In case you are not only interested in the first completed Task, you would have to use Task.WhenAny in an iterative manner until all Task instances have been completed:
Process all Task objects in their order of completion
private async Task<int> DoSomething(int number)
{
    await Task.Delay(number * 300);
    Dispatcher.Invoke(() => tbResults.Text += number + Environment.NewLine);
    return number;
}

private async void ButtonBase_OnClick(object sender, RoutedEventArgs e)
{
    List<int> numbers = new List<int>(Enumerable.Range(1, 5));
    List<Task<int>> tasks = new List<Task<int>>();

    // Alternatively use LINQ Select
    foreach (int number in numbers)
    {
        Task<int> task = DoSomething(number);
        tasks.Add(task);
    }

    // Until all Tasks have completed
    while (tasks.Any())
    {
        Task<int> nextCompletedTask = await Task.WhenAny(tasks);

        // Remove the completed Task so that
        // we can await the next uncompleted Task that completes first
        tasks.Remove(nextCompletedTask);

        // Get the result of the completed Task
        int taskId = await nextCompletedTask;
        tbResults.Text += $"Task {taskId} has completed." + Environment.NewLine;
    }

    tbResults.Text += "This is end text" + Environment.NewLine;
}
"Parallel.ForEach is not awaitable so I thought that wrapping it up in
Task.Run allows me to await it but this is because as you said "Since
the background threads of the Parallel.Foreach are not attached to the
parent Task.Run""
No, that's not exactly what I said. The key point is the third sentence of my answer: "The point is that Parallel.ForEach is not awaitable, therefore the execution continues while the threads of the Parallel.ForEach are suspended using await."
This means: normally Parallel.ForEach executes synchronously: the calling context continues execution only when all threads of the Parallel.ForEach have completed. But since you call await inside those threads, you suspend them in an async/await manner.
Since Parallel.ForEach is not awaitable, it can't handle those await calls and acts as if the suspended threads had completed naturally. Parallel.ForEach does not understand that the threads are just suspended by await and will continue later. In other words, the await chain is broken, as Parallel.ForEach is not able to return a Task to the parent awaited Task.Run context to signal its suspension.
That's what I meant when saying that the threads of Parallel.ForEach are not attached to the Task.Run. They run in total isolation from the async/await infrastructure.
"async lambdas should be "only use with events""
No, that's not correct. When you pass an async lambda to a void delegate like Action<T>, you are correct: the async lambda can't be awaited in this case. But when passing an async lambda to a Func<T> delegate where T is of type Task, your lambda can be awaited:
private void NoAsyncDelegateSupportedMethod(Action lambda)
{
    // Since Action does not return a Task (the return type is always void),
    // the async lambda can't be awaited
    lambda.Invoke();
}

private async Task AsyncDelegateSupportedMethod(Func<Task> asyncLambda)
{
    // Since Func<Task> returns a Task, the async lambda can be awaited
    await asyncLambda.Invoke();
}

public async Task DoSomething()
{
    // Not a good idea, as NoAsyncDelegateSupportedMethod can't handle async lambdas: it takes a void delegate
    NoAsyncDelegateSupportedMethod(async () => await Task.Delay(1));

    // A good idea, as AsyncDelegateSupportedMethod can handle async lambdas: it takes a Func<Task> delegate
    await AsyncDelegateSupportedMethod(async () => await Task.Delay(1));
}
As you can see, your statement is not correct. You must always check the signature of the called method and its overloads. If it accepts a Func<Task> type delegate, you are good to go.
That's how async support is added to Parallel.ForEachAsync: the API accepts a Func<ValueTask> type delegate. Task.Run, for example, accepts a Func<Task>, and therefore the following call is perfectly fine:
Task.Run(async () => await Task.Delay(1));
" I guess that you admit that .Net 6.0 brought the best solution :
which is Parallel.ForEachASYNC! [...] We can spawn a couple of threads
which deal with our tasks in parallel and we can await the whole loop
and we do not need to wait for all tasks to complte- they "report" as
they finish "
That's wrong. Parallel.ForEachAsync supports threads that use async/await, that's true. Indeed, your original example would no longer break the intended flow: because Parallel.ForEachAsync supports await in its threads, it can handle suspended threads and propagate the Task object properly from its threads to the caller context, e.g. to the wrapping await Task.Run.
It now knows how to wait for and resume suspended threads.
Important: Parallel.ForEachAsync still completes only AFTER ALL its threads have completed. Your assumption "they "report" as they finish" is wrong. That's the most intuitive concurrent implementation of a foreach: foreach also completes only after all items have been enumerated.
The solution to process Task objects as they complete is still using the Task.WhenAny pattern from above.
In general, if you don't need the extra features like partitioning etc. of Parallel.ForEach and Parallel.ForEachAsync, you can always use Task.WhenAll instead. Task.WhenAll and Parallel.ForEachAsync are largely equivalent, except that Parallel.ForEachAsync provides greater customization by default: it supports techniques like throttling and partitioning without extra code.
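As an illustration (my sketch, assuming .NET 6 or later), the original DoSomething could be rewritten with Parallel.ForEachAsync; it can then be awaited directly and throttled, but it still only completes after all items have been processed:

private async Task DoSomething()
{
    List<int> numbers = new List<int>(Enumerable.Range(1, 5));

    // MaxDegreeOfParallelism is an arbitrary value, just to illustrate the built-in throttling.
    var options = new ParallelOptions { MaxDegreeOfParallelism = 2 };

    await Parallel.ForEachAsync(numbers, options, async (i, cancellationToken) =>
    {
        await Task.Delay(i * 300, cancellationToken);
        await Dispatcher.InvokeAsync(() => tbResults.Text += i + Environment.NewLine);
    });

    // This line is reached only after ALL iterations have completed.
}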
Good links from Clemens are in the comments. Answering your questions:
In Parallel.ForEach you start/fire an async task for each entry of numbers, which you don't await. So you only await that Parallel.ForEach finishes, and it finishes before the async tasks it started do.
What you could do, for example, is remove async inside of Parallel.ForEach and use Dispatcher.Invoke instead of Dispatcher.BeginInvoke. Thread.Sleep is an anti-pattern ;), so depending on your task you may want another solution (edited: BionicCode has a nice one):
private async Task DoSomething()
{
    var numbers = new List<int>(Enumerable.Range(1, 5));

    await Task.Run(() => Parallel.ForEach(numbers, i =>
    {
        Thread.Sleep(i * 300);
        Dispatcher.Invoke(() => tbResults.Text += i.ToString() + Environment.NewLine);
    }));
}
Related
public static async void DoSomething(IEnumerable<IDbContext> dbContexts)
{
    IEnumerator<IDbContext> dbContextEnumerator = dbContexts.GetEnumerator();

    Task<ProjectSchema> projectSchemaTask = Task.Run(() => Core.Data.ProjectRead
        .GetAll(dbContextEnumerator.Current)
        .Where(a => a.PJrecid == pjRecId)
        .Select(b => new ProjectSchema
        {
            PJtextid = b.PJtextid,
            PJcustomerid = b.PJcustomerid,
            PJininvoiceable = b.PJininvoiceable,
            PJselfmanning = b.PJselfmanning,
            PJcategory = b.PJcategory
        })
        .FirstOrDefault());

    Task<int?> defaultActivitySchemeTask = projectSchemaTask.ContinueWith(antecedent =>
    {
        // This is where an exception may get thrown
        return ProjectTypeRead.GetAll(dbContextEnumerator.Current)
            .Where(a => a.PTid == antecedent.Result.PJcategory)
            .Select(a => a.PTactivitySchemeID)
            .FirstOrDefaultAsync().Result;
    }, TaskContinuationOptions.OnlyOnRanToCompletion);

    Task<SomeModel> customerTask = projectSchemaTask.ContinueWith(antecedent =>
    {
        // This is where an exception may get thrown
        return GetCustomerDataAsync(antecedent.Result.PJcustomerid,
            dbContextEnumerator.Current).Result;
    }, TaskContinuationOptions.OnlyOnRanToCompletion);

    await Task.WhenAll(defaultActivitySchemeTask, customerTask);
}
The exception I am getting:
NotSupportedException: A second operation started on this context before a previous asynchronous operation completed. Use 'await' to ensure that any asynchronous operations have completed before calling another method on this context. Any instance members are not guaranteed to be thread safe.
The exception is thrown only in about 1 out of every 20 calls to this function, and it seems to happen only when I am chaining tasks with ContinueWith().
How can there be a second operation on the context when I am using a new one for each request?
This is just an example of my code. In the real code I have 3 parent tasks, and each parent has 1-5 chained tasks attached to them.
What am I doing wrong?
Yeah, you basically shouldn't use ContinueWith these days. In this case, you end up with two continuations on the same task (for defaultActivitySchemeTask and customerTask); how they interact is now basically undefined and will depend on exactly how the two async flows work, but you could absolutely end up with overlapping async operations here (for example, in the simplest "continuations are sequential" case, as soon as the first one awaits because it is incomplete, the second will start). Frankly, this should be logically sequential await-based code, probably not using Task.Run either, but let's keep that for now:
ProjectSchema projectSchema = await Task.Run(() => ...);
int? defaultActivityScheme = await ... first bit
SomeModel customer = await ... second bit
We can't do the two subordinate queries concurrently without risking concurrent async operations on the same context.
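A minimal sketch of that sequential shape, reusing the names from the question (the enumerator/context handling is assumed to stay as in the original; this is an illustration, not a drop-in fix):

IDbContext dbContext = dbContextEnumerator.Current;

ProjectSchema projectSchema = await Task.Run(() => Core.Data.ProjectRead
    .GetAll(dbContext)
    .Where(a => a.PJrecid == pjRecId)
    .Select(b => new ProjectSchema
    {
        PJtextid = b.PJtextid,
        PJcustomerid = b.PJcustomerid,
        PJininvoiceable = b.PJininvoiceable,
        PJselfmanning = b.PJselfmanning,
        PJcategory = b.PJcategory
    })
    .FirstOrDefault());

// Each asynchronous operation on the context completes before the next one starts,
// so there is never a second concurrent operation on the same DbContext.
int? defaultActivityScheme = await ProjectTypeRead.GetAll(dbContext)
    .Where(a => a.PTid == projectSchema.PJcategory)
    .Select(a => a.PTactivitySchemeID)
    .FirstOrDefaultAsync();

SomeModel customer = await GetCustomerDataAsync(projectSchema.PJcustomerid, dbContext);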
In your example you seem to be running two continuations in parallel, so there is a possibility that they will overlap, causing a concurrency problem. DbContext is not thread safe, so you need to make sure that your asynchronous calls are sequential. Keep in mind that async/await simply turns your code into a state machine, so you can control which operations have completed before moving on to the next one. Using async methods alone will not give you parallel operations, but wrapping your operation in Task.Run will. So you need to ask yourself whether Task.Run (i.e. scheduling work on the ThreadPool) is really required to make it parallel.
You mentioned that in your real code you have 3 parent tasks and each parent has 1-5 chained tasks attached to it. If the 3 parent tasks have separate DbContexts, they can run in parallel (each one of them wrapped in Task.Run), but their chained continuations need to be sequential (leveraging the async/await keywords), like this:
public async Task DoWork()
{
    var parentTask1 = Task.Run(ParentTask1);
    var parentTask2 = Task.Run(ParentTask2);
    var parentTask3 = Task.Run(ParentTask3);
    await Task.WhenAll(parentTask1, parentTask2, parentTask3);
}

private async Task ParentTask1()
{
    // chained child asynchronous continuations
    await Task.Delay(100);
    await Task.Delay(100);
}

private async Task ParentTask2()
{
    // chained child asynchronous continuations
    await Task.Delay(100);
    await Task.Delay(100);
}

private async Task ParentTask3()
{
    // chained child asynchronous continuations
    await Task.Delay(100);
    await Task.Delay(100);
}
If your parent tasks operate on the same DbContext, in order to avoid concurrency you would need to await them one by one (no need to wrap them into Task.Run):
public async Task DoWork()
{
    await ParentTask1();
    await ParentTask2();
    await ParentTask3();
}
Task.WhenAll(IEnumerable<Task>) waits for all tasks in the IEnumerable to complete, but only the tasks that are in the list when it is first called. If any running task adds more tasks to the list, they aren't considered. This short example demonstrates:
List<Task> _tasks = new List<Task>();

public async Task QuickExample()
{
    for (int n = 0; n < 6; ++n)
        _tasks.Add(Func1(n));

    await Task.WhenAll(_tasks);
    Console.WriteLine("Some Tasks complete");

    await Task.WhenAll(_tasks);
    Console.WriteLine("All Tasks complete");
}

async Task Func1(int n)
{
    Console.WriteLine($"Func1-{n} started");
    await Task.Delay(2000);
    if ((n % 3) == 1)
        _tasks.Add(Func2(n));
    Console.WriteLine($"Func1-{n} complete");
}

async Task Func2(int n)
{
    Console.WriteLine($"Func2-{n} started");
    await Task.Delay(2000);
    Console.WriteLine($"Func2-{n} complete");
}
This outputs:
Func1-0 started
Func1-1 started
Func1-2 started
Func1-3 started
Func1-4 started
Func1-5 started
Func1-5 complete
Func1-3 complete
Func2-1 started
Func1-1 complete
Func1-0 complete
Func1-2 complete
Func2-4 started
Func1-4 complete
Some Tasks complete
Func2-4 complete
Func2-1 complete
All Tasks complete
Done
The second Task.WhenAll() solves the problem in this case, but that's a rather fragile solution. What's the best way to handle this in the general case?
You are modifying the List<> without locking it... you like to live a dangerous life :-) Save the Count of _tasks before doing a WhenAll, then after the WhenAll check the Count of _tasks again. If it is different, do another round (so you need a while loop around the WhenAll):
int count = _tasks.Count;

while (true)
{
    await Task.WhenAll(_tasks);

    lock (_tasks)
    {
        if (count == _tasks.Count)
        {
            Console.WriteLine("All Tasks complete");
            break;
        }

        count = _tasks.Count;
        Console.WriteLine("Some Tasks complete");
    }
}

async Task Func1(int n)
{
    Console.WriteLine($"Func1-{n} started");
    await Task.Delay(2000);

    if ((n % 3) == 1)
    {
        lock (_tasks)
        {
            _tasks.Add(Func2(n));
        }
    }

    Console.WriteLine($"Func1-{n} complete");
}
I'll add a second (probably more correct) solution that is different from what you are doing: you could simply await the new Tasks from the Tasks that generate them, without cascading them into the _tasks collection. If A creates B, then A doesn't finish until B finishes, so clearly you don't need to add the new Tasks to the _tasks collection.
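A minimal sketch of that idea, reusing Func1 and Func2 from the question:

async Task Func1(int n)
{
    Console.WriteLine($"Func1-{n} started");
    await Task.Delay(2000);

    // Await the child task directly instead of adding it to _tasks:
    // Func1-{n} now completes only after its Func2-{n} has completed,
    // so a single Task.WhenAll(_tasks) covers the child work as well.
    if ((n % 3) == 1)
        await Func2(n);

    Console.WriteLine($"Func1-{n} complete");
}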
An asynchronous function returns to the caller at the first await.
So the for loop will have completed before you add the extra tasks to the original task list.
The implementation of Task.WhenAll iterates/copies the tasks into a local list, so tasks added after Task.WhenAll has been called are ignored.
In your particular case, moving the call to Func2 before the await Task.Delay() could be a solution:
async Task Func1(int n)
{
    Console.WriteLine($"Func1-{n} started");
    if ((n % 3) == 1)
        _tasks.Add(Func2(n));
    await Task.Delay(2000);
    Console.WriteLine($"Func1-{n} complete");
}
But if, in the real scenario, calling Func2 depends on the result of some asynchronous method, then you need some other solution.
Since it seems that additional tasks can be created during the course of executing the original list of tasks, you will need a simple while construct:
while (_tasks.Any(t => !t.IsCompleted))
{
    await Task.WhenAll(_tasks);
}
This checks the list for any uncompleted tasks and awaits them, until it catches the list at a moment when there are no uncompleted tasks left.
Consider this: it sounds like work is being submitted to the "task list" from another thread. In a sense, the "task submission" thread itself could also be yet another Task for you to wait on.
If you wait for all Tasks to be submitted, then you are guaranteed that your next call to WhenAll will yield a fully completed payload.
Your waiting function could/should be a two-step process:
Wait for the "task submitting" task to complete, signalling that all Tasks have been submitted.
Wait for all the submitted tasks to complete.
Example:
public async Task WaitForAllSubmittedTasks()
{
    // Work is being submitted in a background thread;
    // wrap that thread in a Task, and wait for it to complete.
    var workScheduler = GetWorkScheduler();
    await workScheduler;

    // All tasks submitted!

    // Now we get the completed list of all submitted tasks.
    //
    // It's important to note: the submitted tasks
    // have been chugging along all this time.
    // By the time we get here, there are probably a number of
    // completed tasks already. It does not delay the speed
    // or execution of our work items if we grab the List
    // after some of the work has been completed.
    //
    // It's entirely possible that - by the time we call
    // this function and wait on it - almost all the
    // tasks have already been completed!
    var submittedWork = GetAllSubmittedTasks();
    await Task.WhenAll(submittedWork);

    // Work complete!
}
I'm looping through an array of values, and for each value I want to execute a long-running process. Since the tasks to be performed have no interdependency, I want to be able to execute them in parallel.
My code is:
List<Task<bool>> dependantTasksQuery = new List<Task<bool>>();

foreach (int dependantID in dependantIDList)
{
    dependantTasksQuery.Add(WaitForDependantObject(dependantID));
}

Task<bool>[] dependantTasks = dependantTasksQuery.ToArray();

// Wait for all dependant tasks to complete
bool[] lengths = await Task.WhenAll(dependantTasks);
The WaitForDependantObject method just looks like:
async Task<bool> WaitForDependantObject(int idVal)
{
    System.Threading.Thread.Sleep(20000);
    bool waitDone = true;
    return waitDone;
}
As you can see, I've just added a sleep to highlight my issue. What is happening when debugging is that on the line:
dependantTasksQuery.Add(WaitForDependantObject(dependantID));
my code stops and waits the 20 seconds for the method to complete. I did not want execution to start until I had completed the loop and built up the array. Can somebody point me to what I'm doing wrong? I'm pretty sure I need an await somewhere.
In your case WaitForDependantObject isn't asynchronous at all, even though it returns a task. If that's your goal, do as Luke Willis suggests. To make these calls both asynchronous and truly parallel, you need to offload them to a thread pool thread with Task.Run:
bool[] lengths = await Task.WhenAll(dependantIDList.Select(dependantID => Task.Run(() => WaitForDependantObject(dependantID))));
async methods run synchronously until an await is reached and then return a task representing the asynchronous operation. In your case you don't have an await, so the methods simply execute one after the other. Task.Run uses multiple threads to enable parallelism even for these synchronous parts, on top of the concurrency you get from awaiting all the tasks together with Task.WhenAll.
For WaitForDependantObject to represent an async method more accurately it should look like this:
async Task<bool> WaitForDependantObject(int idVal)
{
    await Task.Delay(20000);
    return true;
}
Use Task.Delay to make the method asynchronous and a more realistic replacement for the mocked code:
async Task<bool> WaitForDependantObject(int idVal)
{
    // how long the synchronous part of the method takes (before the first await)
    System.Threading.Thread.Sleep(1000);

    // the method returns to the caller as soon as awaiting starts
    await Task.Delay(2000); // how long the IO or other async operation takes

    // simulate data processing; this continues on a thread pool thread unless the code
    // runs in WPF/WinForms/ASP.NET and the caller did not use ConfigureAwait(false),
    // in which case it continues on the captured context.
    System.Threading.Thread.Sleep(1000);

    bool waitDone = true;
    return waitDone;
}
You can do this using Task.Factory.StartNew.
Replace this:
dependantTasksQuery.Add(WaitForDependantObject(dependantID));
with this:
dependantTasksQuery.Add(
    Task.Factory.StartNew(
        () => WaitForDependantObject(dependantID)
    )
);
This will run your method within a new Task and add the task to your List.
You will also want to change the method signature of WaitForDependantObject to be:
bool WaitForDependantObject(int idVal)
You can then wait for your tasks to complete with:
Task.WaitAll(dependentTasksQuery.ToArray());
And get your results with:
bool[] lengths = dependentTasksQuery.Select(task => task.Result).ToArray();
I'm having some trouble getting a task to delay asynchronously. I am writing an application that needs to run tens or hundreds of thousands of asynchronously executing scripts. I am doing this using C# Actions, and sometimes, during the execution of a particular sequence, the script needs to wait for an external resource to reach an expected state in order to execute properly. At first I wrote this using Thread.Sleep(), but that turned out to be a torpedo to the application's performance, so I'm looking into async/await for an async sleep. But I can't get it to actually wait on the pause! Can someone explain this?
static void Main(string[] args)
{
    var sync = new List<Action>();
    var async = new List<Action>();

    var syncStopWatch = new Stopwatch();
    sync.Add(syncStopWatch.Start);
    sync.Add(() => Thread.Sleep(1000));
    sync.Add(syncStopWatch.Stop);
    sync.Add(() => Console.Write("Sync:\t" + syncStopWatch.ElapsedMilliseconds + "\n"));

    var asyncStopWatch = new Stopwatch();
    sync.Add(asyncStopWatch.Start);
    sync.Add(async () => await Task.Delay(1000));
    sync.Add(asyncStopWatch.Stop);
    sync.Add(() => Console.Write("Async:\t" + asyncStopWatch.ElapsedMilliseconds + "\n"));

    foreach (Action a in sync)
    {
        a.Invoke();
    }

    foreach (Action a in async)
    {
        a.Invoke();
    }
}
The results of the execution are:
Sync: 999
Async: 2
How do I get it to wait asynchronously?
You're running into a problem with async void. When you pass an async lambda to an Action, the compiler creates an async void method for you.
As a best practice, you should avoid async void.
One way to do this is to make your list of actions a List<Func<Task>> instead of a List<Action>. This allows you to queue async Task methods instead of async void methods.
This means your "execution" code would have to await each Task as it completes. Also, your synchronous methods would have to return Task.FromResult(0) or something like that so they match the Func<Task> signature.
If you want a bigger scope solution, I recommend you strongly consider TPL Dataflow instead of creating your own queue.
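And purely as an illustration of the Dataflow suggestion (assuming the System.Threading.Tasks.Dataflow package and that the fragment runs inside an async method), an ActionBlock accepts async delegates and awaits them for you:

// Requires the System.Threading.Tasks.Dataflow NuGet package.
var block = new ActionBlock<int>(async delayMs =>
{
    await Task.Delay(delayMs); // awaited by the block, not fire-and-forget
    Console.WriteLine($"Waited {delayMs} ms");
});

block.Post(1000);
block.Post(500);

block.Complete();        // no more items will be posted
await block.Completion;  // completes after all posted items have been processed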
Here is sample code for starting multiple tasks:
Task.Factory.StartNew(() =>
{
    //foreach (KeyValuePair<string, string> entry in dicList)
    Parallel.ForEach(dicList, entry =>
    {
        // create and add the Progress in the UI thread
        var ucProgress = (Progress)fpPanel.Invoke(createProgress, entry);
        // execute ucProgress.Process(); in a non-UI thread, in parallel.
        // Process() must update the UI by using *Invoke
        ucProgress.Process();
        System.Threading.Thread.SpinWait(5000000);
    });
})
.ContinueWith(task =>
{
    // to handle exceptions use the task.Exception member
    var progressBar = (ProgressBar)task.AsyncState;
    if (!task.IsCanceled)
    {
        // hide the progress bar here and reset pb.Value = 0
    }
},
TaskScheduler.FromCurrentSynchronizationContext() // update UI from the UI thread
);
When we start multiple tasks using Task.Factory.StartNew(), we can use a .ContinueWith() block to determine when each task finishes; that is, the ContinueWith block fires once for each task's completion. So I just want to know whether there is any mechanism in the TPL such that, if I start 10 tasks using Task.Factory.StartNew(), I am notified when all 10 tasks have finished. Please give some insight with sample code.
"if I start 10 tasks using Task.Factory.StartNew(), how do I get notified when all 10 tasks have finished"
Three options:
The blocking Task.WaitAll call, which only returns when all the given tasks have completed
The async Task.WhenAll call, which returns a task which completes when all the given tasks have completed. (Introduced in .NET 4.5.)
TaskFactory.ContinueWhenAll, which adds a continuation task which will run when all the given tasks have completed.
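A minimal sketch of the last two options (my illustration; the task bodies are placeholders, the alternatives are shown one after another just for demonstration, and the await assumes an async method):

// Start the 10 tasks.
Task[] tasks = Enumerable.Range(0, 10)
    .Select(_ => Task.Factory.StartNew(() => Thread.Sleep(100)))
    .ToArray();

// Option 2: await a task that completes when all 10 have completed (.NET 4.5+).
await Task.WhenAll(tasks);
Console.WriteLine("All 10 tasks finished (WhenAll)");

// Option 3: attach a single continuation that runs when all 10 have completed.
Task.Factory.ContinueWhenAll(tasks, completed =>
    Console.WriteLine($"All {completed.Length} tasks finished (ContinueWhenAll)"));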
"if I start 10 tasks using Task.Factory.StartNew(), how do I get notified when all 10 tasks have finished"
You can use Task.WaitAll. This call blocks the current thread until all tasks are finished.
Side note: you seem to be combining Task, Parallel and Thread.SpinWait, which makes your code complex. I would spend a bit of time analysing whether that complexity is really necessary.
You can use WaitAll(). Example:
Func<bool> DummyMethod = () =>
{
    // When ready, send back complete!
    return true;
};

// Create the array of tasks
System.Threading.Tasks.Task<bool>[] tasks = new System.Threading.Tasks.Task<bool>[2];

// First task
var firstTask = System.Threading.Tasks.Task.Factory.StartNew(() => DummyMethod(), TaskCreationOptions.LongRunning);
tasks[0] = firstTask;

// Second task
var secondTask = System.Threading.Tasks.Task.Factory.StartNew(() => DummyMethod(), TaskCreationOptions.LongRunning);
tasks[1] = secondTask;

// Wait for all of them to complete
System.Threading.Tasks.Task.WaitAll(tasks);
Another solution:
After the completion of all the operations inside Parallel.For(...), it returns a ParallelLoopResult object. From the documentation:
For returns a System.Threading.Tasks.ParallelLoopResult object when
all threads have completed. This return value is useful when you are
stopping or breaking loop iteration manually, because the
ParallelLoopResult stores information such as the last iteration that
ran to completion. If one or more exceptions occur on one of the
threads, a System.AggregateException will be thrown.
The ParallelLoopResult struct has an IsCompleted property that is set to false when the Stop() or Break() method has been executed.
Example:
ParallelLoopResult result = Parallel.For(...);

if (result.IsCompleted)
{
    // Start another task
}
Note that it is advised to use this only when breaking or stopping the loop manually (otherwise just use WaitAll, WhenAll, etc.).