I'm following the pattern described at: https://blogs.msdn.microsoft.com/dotnet/2012/06/06/async-in-4-5-enabling-progress-and-cancellation-in-async-apis/ so I'm using IProgress.Report in my async method to report back the progress.
public async Task MyFirstTaskAsync(IProgress<T> progress)
{
// ... do the first part of the work ...
progress.Report(someValue); // someValue is of type T
}
public async Task MySecondTaskAsync(IProgress<T> progress)
{
// ... do the second part of the work ...
progress.Report(someValue);
}
On the UI thread, I'll call MyFirstTaskAsync and MySecondTaskAsync sequentially.
var reportProgress1 = new Action<T>(x => Console.WriteLine("First Step Completed"));
var reportProgress2 = new Action<T>(x => Console.WriteLine("Last Step Completed"));
var progress1 = new Progress<T>(reportProgress1);
var progress2 = new Progress<T>(reportProgress2);
await MyFirstTaskAsync(progress1);
Console.WriteLine("Second Step Completed");
await MySecondTaskAsync(progress2);
// the callback passed to progress1 can actually be executed at this point...
My problem here is that the callback in reportProgress1 can actually get executed after MyFirstTaskAsync has completed, which messes up the progress reporting, because I was expecting the first await to wait until the progress report inside the first async method had also completed.
Is this a behavior I can somehow tweak to fit my need?
Thanks.
Edit: Suppose there is some non-async code between the two tasks, and the three pieces together complete the overall process. Each of them updates the progress as it runs, so they have to update the UI in sequential order.
No, you can't "await" IProgress.Report. Progress reports are essentially fire-and-forget.
You can, however, use your own implementation of IProgress with appropriate semantics. In particular, Reactive Extensions is good for taming "streams of data" (in this case, streams of progress reports).
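For illustration, here is a minimal sketch of that first option, assuming a custom type (the SynchronousProgress name is mine, not a BCL class) that invokes the handler inline on Report, so the callback has finished before the awaited method returns:
// Minimal sketch of a synchronous IProgress<T>: the handler runs inline on the
// reporting thread, so Report has completed before the async method continues.
// Unlike Progress<T>, this does NOT marshal the callback to the UI thread.
public sealed class SynchronousProgress<T> : IProgress<T>
{
    private readonly Action<T> _handler;

    public SynchronousProgress(Action<T> handler) => _handler = handler;

    public void Report(T value) => _handler(value);
}

// Usage:
// var progress1 = new SynchronousProgress<T>(x => Console.WriteLine("First Step Completed"));
// await MyFirstTaskAsync(progress1); // the report is guaranteed to have run by now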
I have some simple asynchronous code below: this is a WPF app with one button and one TextBox.
I used a list with five integers to mimic 5 different tasks.
My intention was that when I run all five tasks in parallel and asynchronously,
I can observe the numbers being added to the textbox one by one.
And I achieved it. The method "DoSomething" runs all five tasks in parallel, and each task has a different execution time (simulated by Task.Delay), so the numbers appear in the textbox one by one.
The only problem that I cannot figure out is: why is the string "This is end text" displayed in the textbox first?! If I await the method DoSomething, then it should complete first and only then should the rest of the code be executed, even though in my case that is just a repainting of the GUI.
I guess that this might be caused by the use of Dispatcher.BeginInvoke, which may "cause some disturbance" to the async/await mechanism. But I would appreciate a small clue and how to avoid this behavior.
I know that I could use Progress<T> to achieve a similar effect, but is there any other way that I can use a parallel loop and update results progressively in WPF, avoiding the unexpected behaviour I described?
private async void Button_Click(object sender, RoutedEventArgs e)
{
await DoSomething();
tbResults.Text += "This is end text";
}
private async Task DoSomething()
{
List<int> numbers = new List<int>(Enumerable.Range(1, 5));
await Task.Run(() => Parallel.ForEach(numbers, async i =>
{
await Task.Delay(i * 300);
await Dispatcher.BeginInvoke(() => tbResults.Text += i.ToString() + Environment.NewLine);
}));
}
// output is:
//This is end text 1 2 3 4 5 (all in separate lines).
My questions:
Why is the text displayed before the method DoSomething completes?
How to solve/avoid it; is there any alternative way to solve it (except using Progress<T>)?
Any info will be highly appreciated.
The threads of Parallel.ForEach are "real" background threads. They are created and the application continues execution. The point is that Parallel.ForEach is not awaitable, therefore execution continues while the threads of the Parallel.ForEach are suspended by await.
private async Task DoSomething()
{
List<int> numbers = new List<int>(Enumerable.Range(1, 5));
// Run the Parallel.ForEach on a thread pool thread
await Task.Run(() =>
{
// Create the threads of Parallel.ForEach and continue
Parallel.ForEach(numbers, async i =>
{
// await suspends the thread and forces it to return.
// Because Parallel.ForEach is not awaitable,
// execution leaves the scope of the Parallel.ForEach and continues.
await Task.Delay(i * 300);
await Dispatcher.BeginInvoke(() => tbResults.Text += i.ToString() + Environment.NewLine);
});
// After the threads are created, the internal awaits of the Parallel.ForEach lambdas suspend the background threads and
// force execution to return from the Parallel.ForEach.
// The Task.Run thread continues.
Dispatcher.InvokeAsync(() => tbResults.Text += "Text while Parallel.ForEach threads are suspended");
// Since the background threads of the Parallel.ForEach are not attached
// to the parent Task.Run, the Task.Run completes now and returns,
// i.e. Task.Run does not wait for child background threads to complete.
// ==> Leave Task.Run as there is no more work.
});
// Leave DoSomething() and continue to execute the remaining code in Button_Click().
// The Parallel.ForEach threads are still suspended until the await chain, in this case Button_Click(), is completed.
}
The solution is to implement the pattern suggested by Clemens' comment, or an async implementation of the Producer/Consumer pattern using e.g. BlockingCollection or Channel to gain more control over a fixed number of threads while distributing the "unlimited" number of jobs (a Channel-based sketch follows the example below).
private async Task DoSomething(int number)
{
await Task.Delay(number * 300);
Dispatcher.Invoke(() => tbResults.Text += number + Environment.NewLine);
}
private async void ButtonBase_OnClick(object sender, RoutedEventArgs e)
{
List<int> numbers = new List<int>(Enumerable.Range(1, 5));
List<Task> tasks = new List<Task>();
// Alternatively use LINQ Select
foreach (int number in numbers)
{
Task task = DoSomething(number);
tasks.Add(task);
}
await Task.WhenAll(tasks);
tbResults.Text += "This is end text" + Environment.NewLine;
}
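For completeness, here is a minimal sketch of the Channel-based Producer/Consumer variant mentioned above (my own illustration, not part of the original answer; it assumes the System.Threading.Channels namespace and the same DoSomething(int) helper as in the example above):
private async Task RunWithChannelAsync()
{
    // Bounded channel: the producer waits when the buffer is full.
    var channel = Channel.CreateBounded<int>(capacity: 10);

    // Fixed number of consumers that process jobs as they arrive.
    const int consumerCount = 3;
    Task[] consumers = Enumerable.Range(0, consumerCount)
        .Select(_ => Task.Run(async () =>
        {
            await foreach (int number in channel.Reader.ReadAllAsync())
            {
                await DoSomething(number); // same helper as in the example above
            }
        }))
        .ToArray();

    // Producer: post the jobs, then signal that no more will come.
    foreach (int number in Enumerable.Range(1, 5))
    {
        await channel.Writer.WriteAsync(number);
    }
    channel.Writer.Complete();

    await Task.WhenAll(consumers);
}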
Discussing the comments
"my intention was to run tasks in parallel and "report" once they are
completed i.e. the taks which takes the shortest would "report" first
and so on."
This is exactly what is happening in the above solution. The Task with the shortest delay appends text to the TextBox first.
"But implementing your suggested await Task.WhenAll(tasks) causes
that we need to wait for all tasks to complete and then report all at
once."
To process Task objects in their order of completion, you would replace Task.WhenAll with Task.WhenAny. In case you are not only interested in the first completed Task, you would have to use Task.WhenAny in an iterative manner until all Task instances have been completed:
Process all Task objects in their order of completion
private async Task<int> DoSomething(int number)
{
await Task.Delay(number * 300);
Dispatcher.Invoke(() => tbResults.Text += number + Environment.NewLine);
return number; // return the id so the caller can report which Task completed
}
private async void ButtonBase_OnClick(object sender, RoutedEventArgs e)
{
List<int> numbers = new List<int>(Enumerable.Range(1, 5));
List<Task<int>> tasks = new List<Task<int>>();
// Alternatively use LINQ Select
foreach (int number in numbers)
{
Task<int> task = DoSomething(number);
tasks.Add(task);
}
// Until all Tasks have completed
while (tasks.Any())
{
Task<int> nextCompletedTask = await Task.WhenAny(tasks);
// Remove the completed Task so that
// we can await the next uncompleted Task that completes first
tasks.Remove(nextCompletedTask);
// Get the result of the completed Task
int taskId = await nextCompletedTask;
tbResults.Text += $"Task {taskId} has completed." + Environment.NewLine;
}
tbResults.Text += "This is end text" + Environment.NewLine;
}
"Parallel.ForEach is not awaitable so I thought that wrapping it up in
Task.Run allows me to await it but this is because as you said "Since
the background threads of the Parallel.Foreach are not attached to the
parent Task.Run""
No, that's not exactly what I said. The key point is the third sentence of my answer: "The point is that Parallel.ForEach is not awaitable, therefore execution continues while the threads of the Parallel.ForEach are suspended by await."
This means: normally Parallel.ForEach executes synchronously: the calling context continues execution only when all threads of Parallel.ForEach have completed. But since you call await inside those threads, you suspend them in an async/await manner.
Since Parallel.ForEach is not awaitable, it can't handle the await calls and acts as if the suspended threads had completed naturally. Parallel.ForEach does not understand that the threads are just suspended by await and will continue later. In other words, the await chain is broken, as Parallel.ForEach is not able to return a Task to the parent awaited Task.Run context to signal its suspension.
That's what I meant when saying that the threads of Parallel.ForEach are not attached to the Task.Run. They run in total isolation from the async/await infrastructure.
"async lambdas should be "only use with events""
No, that's not correct. When you pass an async lambda to a void-returning delegate like Action<T>, you are right: the async lambda can't be awaited in that case. But when you pass an async lambda to a Func<Task> delegate, your lambda can be awaited:
private void NoAsyncDelegateSupportedMethod(Action lambda)
{
// Since Action does not return a Task (return type is always void),
// the async lambda can't be awaited
lambda.Invoke();
}
private async Task AsyncDelegateSupportedMethod(Func<Task> asyncLambda)
{
// Since Func returns a Task, the async lambda can be awaited
await asyncLambda.Invoke();
}
public async Task DoSomething()
{
// Not a good idea, as NoAsyncDelegateSupportedMethod can't handle async lambdas: it defines a void delegate
NoAsyncDelegateSupportedMethod(async () => await Task.Delay(1));
// A good idea, as AsyncDelegateSupportedMethod can handle async lambdas: it defines a Func<Task> delegate
await AsyncDelegateSupportedMethod(async () => await Task.Delay(1));
}
As you can see your statement is not correct. You must always check the signature of the called method and its overloads. If it accepts a Func<Task> type delegate you are good to go.
That's how async support is added to Parallel.ForEachAsync: the API accepts a delegate that returns a ValueTask. For example, Task.Run accepts a Func<Task>, and therefore the following call is perfectly fine:
Task.Run(async () => await Task.Delay(1));
" I guess that you admit that .Net 6.0 brought the best solution :
which is Parallel.ForEachASYNC! [...] We can spawn a couple of threads
which deal with our tasks in parallel and we can await the whole loop
and we do not need to wait for all tasks to complte- they "report" as
they finish "
That's wrong. Parallel.ForEachAsync supports threads that use async/await, that's true. Indeed, your original example would no longer break the intended flow: because Parallel.ForEachAsync supports await in its threads, it can handle suspended threads and propagate the Task object properly from its threads to the caller context, e.g., to the wrapping await Task.Run.
It now knows how to wait for and resume suspended threads.
Important: Parallel.ForEachAsync still completes AFTER ALL its threads have completed. Your assumption "they "report" as they finish" is wrong. That's the most intuitive concurrent implementation of a foreach: foreach also completes only after all items have been enumerated.
The solution to process Task objects as they complete is still using the Task.WhenAny pattern from above.
In general, if you don't need the extra features like partitioning of Parallel.ForEach and Parallel.ForEachAsync, you can always use Task.WhenAll instead. Task.WhenAll and Parallel.ForEachAsync are largely equivalent, except that Parallel.ForEachAsync provides greater customization by default: it supports techniques like throttling and partitioning without extra code.
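As a rough sketch of that (my own example, assuming .NET 6 or later and the same WPF window as above), the whole loop can be awaited directly and its parallelism throttled through ParallelOptions:
private async Task DoSomethingForEachAsync()
{
    var numbers = Enumerable.Range(1, 5);
    var options = new ParallelOptions { MaxDegreeOfParallelism = 3 };

    // The loop itself is awaitable; it completes only after all iterations have completed.
    await Parallel.ForEachAsync(numbers, options, async (number, cancellationToken) =>
    {
        await Task.Delay(number * 300, cancellationToken);
        await Dispatcher.InvokeAsync(() => tbResults.Text += number + Environment.NewLine);
    });

    tbResults.Text += "This is end text" + Environment.NewLine;
}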
Good links from Clemens, see the comments. Answering your questions:
In Parallel.ForEach you start/fire an async task for each entry of numbers, which you don't await. So you only await that Parallel.ForEach finishes, and it finishes before its async tasks do.
What you could do, for example: remove the async inside the Parallel.ForEach and use Dispatcher.Invoke instead of Dispatcher.BeginInvoke. Thread.Sleep is an anti-pattern ;), so depending on your task you may want another solution (edit: BionicCode has a nice one):
private async Task DoSomething()
{
var numbers = new List<int>(Enumerable.Range(1, 5));
await Task.Run(() => Parallel.ForEach(numbers, i =>
{
Thread.Sleep(i * 300);
Dispatcher.Invoke(() => tbResults.Text += i.ToString() + Environment.NewLine);
}));
}
In my current project, I have a piece of code that, after simplifying it down to where I'm having issues, looks something like this:
private async Task RunAsync(CancellationToken cancel)
{
bool finished = false;
while (!cancel.IsCancellationRequested && !finished)
finished = await FakeTask();
}
private Task<bool> FakeTask()
{
return Task.FromResult(false);
}
If I use this code without awaiting, I end up blocking anyway:
// example 1
var task = RunAsync(cancel); // Code blocks here...
... // Other code that could run while RunAsync is doing its thing, but is forced to wait
await task;
// example 2
var task = RunAsync(cancelSource.Token); // Code blocks here...
cancelSource.Cancel(); // Never called
In the actual project, I'm not actually using FakeTask, and there will usually be some Task.Delay I'm awaiting in there, so most of the time the code doesn't actually block, or only does so for a limited number of iterations.
In unit testing, however, I'm using a mock object that does pretty much what FakeTask does, so when I want to see whether RunAsync responds to its CancellationToken getting cancelled the way I expect it to, I'm stuck.
I have found I can fix this issue by adding, for example, await Task.Delay(1) at the top of RunAsync to force it to truly run asynchronously, but this feels a bit hacky. Are there better alternatives?
You have an incorrect mental picture of what await does. The meaning of await is:
Check to see if the awaitable object is complete. If it is, fetch its result and continue executing the coroutine.
If it is not complete, sign up the remainder of the current method as the continuation of the awaitable and suspend the coroutine by returning control to the caller. (Note that this makes it a semicoroutine.)
In your program, the "fake" awaitable is always complete, so there is never a suspension of the coroutine.
Are there better alternatives?
If your control flow logic requires you to suspend the coroutine then use Task.Yield.
Task.FromResult actually runs synchronously, as would await Task.Delay(0). If you want to actually simulate asynchronous code, call Task.Yield(). That creates an awaitable task that asynchronously yields back to the current context when awaited.
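For illustration, a minimal sketch of that change applied to the RunAsync method from the question (assuming the same FakeTask helper), so a unit test gets a chance to cancel the token:
private async Task RunAsync(CancellationToken cancel)
{
    bool finished = false;
    while (!cancel.IsCancellationRequested && !finished)
    {
        // Yield control back to the caller / thread pool so the loop is truly
        // asynchronous even when FakeTask completes synchronously.
        await Task.Yield();
        finished = await FakeTask();
    }
}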
As @SLaks said, your code will run synchronously. One thing is running async code, and another thing is running parallel code.
If you need to run your code in parallel you can use Task.Run.
class Program
{
static async Task Main(string[] args)
{
var tcs = new CancellationTokenSource();
var task = Task.Run(() => RunAsync("1", tcs.Token));
var task2 = Task.Run(() => RunAsync("2", tcs.Token));
await Task.Delay(1000);
tcs.Cancel();
Console.ReadLine();
}
private static async Task RunAsync(string source, CancellationToken cancel)
{
bool finished = false;
while (!cancel.IsCancellationRequested && !finished)
finished = await FakeTask(source);
}
private static Task<bool> FakeTask(string source)
{
Console.WriteLine(source);
return Task.FromResult(false);
}
}
C#'s async methods execute synchronously up to the point where they have to wait for a result.
In your example there is no such point where the method has to wait for a result, so the loop keeps running forever and thereby blocking the caller.
Inserting an await Task.Yield() to simulate some real async work should help.
I have around 10 000 000 tasks that each takes from 1-10 seconds to complete. I am running those tasks on a powerful server, using 50 different threads, where each thread picks the first not-done task, runs it, and repeats.
Pseudo-code:
for i = 0 to 50:
run a new thread:
while True:
task = first available task
if no available tasks: exit thread
run task
Using this code, I can run all the tasks in parallel on any given number of threads.
In reality, the code uses C#'s Task.WhenAll, and looks like this:
ServicePointManager.DefaultConnectionLimit = threadCount; //Allow more HTTP requests simultaneously
var currentIndex = -1;
var threads = new List<Task>(); //List of threads
for (int i = 0; i < threadCount; i++) //Generate the threads
{
var wc = CreateWebClient();
threads.Add(Task.Run(() =>
{
while (true) //Each thread should loop, picking the first available task, and executing it.
{
var index = Interlocked.Increment(ref currentIndex);
if (index >= tasks.Count) break;
var task = tasks[index];
RunTask(conn, wc, task, port);
}
}));
}
await Task.WhenAll(threads);
This works just as I wanted it to, but I have a problem: since this code takes a lot of time to run, I want the user to see some progress. The progress is displayed in a colored bitmap (representing a matrix), and also takes some time to generate (a few seconds).
Therefore, I want to generate this visualization on a background thread. But this other background thread is never executed. My suspicion is that it is using the same thread pool as the parallel code, and is therefore enqueued, and will not be executed before the parallel code is actually finished. (And that's a bit too late.)
Here's an example of how I generate the progress visualization:
private async void Refresh_Button_Clicked(object sender, RoutedEventArgs e)
{
var bitmap = await Task.Run(() => // <<< This task is never executed!
{
//bla, bla, various database calls, and generating a relatively large bitmap
});
//Convert the bitmap into a WPF image, and update the GUI
VisualizationImage = BitmapToImageSource(bitmap);
}
So, how could I best solve this problem? I could create a list of Tasks, where each Task represents one of my tasks, and run them with Parallel.Invoke, and pick another Thread pool (I think). But then I have to generate 10 million Task objects, instead of just 50 Task objects, running through my array of stuff to do. That sounds like it uses much more RAM than necessary. Any clever solutions to this?
EDIT:
As Panagiotis Kanavos suggested in one of his comments, I tried replacing some of my loop logic with ActionBlock, like this:
// Create an ActionBlock<int> that performs some work.
var workerBlock = new ActionBlock<ZoneTask>(
t =>
{
var wc = CreateWebClient(); //This probably generates some unnecessary overhead, but that's a problem I can solve later.
RunTask(conn, wc, t, port);
},
// Specify a maximum degree of parallelism.
new ExecutionDataflowBlockOptions
{
MaxDegreeOfParallelism = threadCount
});
foreach (var t in tasks) //Note: the objects in the tasks array are not Task objects
workerBlock.Post(t);
workerBlock.Complete();
await workerBlock.Completion;
Note: RunTask just executes a web request using the WebClient and parses the results. There's nothing in there that can create a deadlock.
This seems to work like the old parallelism code, except that it needs a minute or two for the initial foreach loop that posts the tasks. Is this delay really worth it?
Nevertheless, my progress task still seems to be blocked. Ignoring the Progress<T> suggestion for now, since this reduced code still suffers the same problem:
private async void Refresh_Button_Clicked(object sender, RoutedEventArgs e)
{
Debug.WriteLine("This happens");
var bitmap = await Task.Run(() =>
{
Debug.WriteLine("This does not!");
//Still doing some work here, so it's not optimized away.
});
VisualizationImage = BitmapToImageSource(bitmap);
}
So it still looks like new tasks are not executed as long as the parallel task is running. I even reduced the "MaxDegreeOfParallelism" from 50 to 5 (on a 24 core server) to see if Peter Ritchie's suggestion was right, but no change. Any other suggestions?
ANOTHER EDIT:
The issue seems to have been that I overloaded the thread pool with all my simultaneous blocking I/O calls. I replaced WebClient with HttpClient and its async methods, and now everything seems to be working nicely.
Thanks to everyone for the great suggestions! Even though not all of them directly solved the problem, I'm sure they all improved my code. :)
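For anyone looking for the concrete shape of that change, here is a minimal sketch (my own; the RunTaskAsync name, the URL, and the Id property are illustrative placeholders, and it assumes System.Net.Http and System.Threading.Tasks.Dataflow) that combines async HttpClient calls with an ActionBlock so no thread pool thread is blocked on I/O:
// Reuse a single HttpClient instance for all requests.
private static readonly HttpClient httpClient = new HttpClient();

private async Task RunTaskAsync(ZoneTask task)
{
    // Await the HTTP call instead of blocking a thread pool thread as WebClient did.
    // The URL and task.Id are placeholders for whatever RunTask actually requests.
    string response = await httpClient.GetStringAsync("http://example.com/api/zones/" + task.Id);
    // ... parse the response ...
}

private async Task RunAllAsync(IEnumerable<ZoneTask> tasks, int maxParallelism)
{
    var workerBlock = new ActionBlock<ZoneTask>(
        t => RunTaskAsync(t), // async delegate: the block awaits the returned Task
        new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = maxParallelism });

    foreach (var t in tasks)
        workerBlock.Post(t);

    workerBlock.Complete();
    await workerBlock.Completion;
}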
.NET already provides a mechanism to report progress with the IProgress<T> interface and the Progress<T> implementation.
The IProgress<T> interface allows clients to publish messages with the Report(T) method without having to worry about threading. The implementation ensures that the messages are processed on the appropriate thread, e.g. the UI thread. By using the simple IProgress<T> interface, the background methods are decoupled from whoever processes the messages.
You can find more information in the Async in 4.5: Enabling Progress and Cancellation in Async APIs article. The cancellation and progress APIs aren't specific to the TPL. They can be used to simplify cancellation and reporting even for raw threads.
Progress<T> processes messages on the thread on which it was created. This can be done either by passing a processing delegate when the class is instantiated, or by subscribing to an event (an event-based sketch is shown at the end of this answer). Copying from the article:
private async void Start_Button_Click(object sender, RoutedEventArgs e)
{
//construct Progress<T>, passing ReportProgress as the Action<T>
var progressIndicator = new Progress<int>(ReportProgress);
//call async method
int uploads = await UploadPicturesAsync(GenerateTestImages(), progressIndicator);
}
where ReportProgress is a method that accepts a parameter of int. It could also accept a complex class that reported work done, messages etc.
The asynchronous method only has to use IProgress.Report, eg:
async Task<int> UploadPicturesAsync(List<Image> imageList, IProgress<int> progress)
{
int totalCount = imageList.Count;
int processCount = await Task.Run<int>(async () =>
{
int tempCount = 0;
foreach (var image in imageList)
{
//await the processing and uploading logic here
int processed = await UploadAndProcessAsync(image);
if (progress != null)
{
progress.Report((tempCount * 100 / totalCount));
}
tempCount++;
}
return tempCount;
});
return processCount;
}
This decouples the background method from whoever receives and processes the progress messages.
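As a small complementary sketch (my own, not from the article), the event-based subscription mentioned above would look roughly like this, with progressBar being an assumed UI element:
var progressIndicator = new Progress<int>();
progressIndicator.ProgressChanged += (sender, percent) =>
{
    // Runs on the SynchronizationContext captured when Progress<int> was created (the UI thread here).
    progressBar.Value = percent; // progressBar is an assumed UI element
};

int uploads = await UploadPicturesAsync(GenerateTestImages(), progressIndicator);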
I'm looping through an array of values; for each value I want to execute a long-running process. Since I have multiple tasks to be performed that have no interdependency, I want to be able to execute them in parallel.
My code is:
List<Task<bool>> dependantTasksQuery = new List<Task<bool>>();
foreach (int dependantID in dependantIDList)
{
dependantTasksQuery.Add(WaitForDependantObject(dependantID));
}
Task<bool>[] dependantTasks = dependantTasksQuery.ToArray();
//Wait for all dependant tasks to complete
bool[] lengths = await Task.WhenAll(dependantTasks);
The WaitForDependantObject method just looks like:
async Task<bool> WaitForDependantObject(int idVal)
{
System.Threading.Thread.Sleep(20000);
bool waitDone = true;
return waitDone;
}
As you can see I've just added a sleep to highlight my issue. What is happening when debugging is that on the line:
dependantTasksQuery.Add(WaitForDependantObject(dependantID));
My code is stopping and waiting the 20 seconds for the method to complete. I did not want to start the execution until I had completed the loop and built up the array. Can somebody point me to what I'm doing wrong? I'm pretty sure I need an await somewhere.
In your case WaitForDependantObject isn't asynchronous at all, even though it returns a task. If that's your goal, do as Luke Willis suggests. To make these calls both asynchronous and truly parallel, you need to offload them to a thread pool thread with Task.Run:
bool[] lengths = await Task.WhenAll(dependantIDList.Select(dependantID => Task.Run(() => WaitForDependantObject(dependantID))));
async methods run synchronously until an await is reached and then return a task representing the asynchronous operation. In your case you don't have an await, so the methods simply execute one after the other. Task.Run uses multiple threads to enable parallelism even for those synchronous parts, on top of the concurrency of awaiting all the tasks together with Task.WhenAll.
For WaitForDependantObject to represent an async method more accurately it should look like this:
async Task<bool> WaitForDependantObject(int idVal)
{
await Task.Delay(20000);
return true;
}
Use Task.Delay to make the method asynchronous and a more realistic replacement for the mocked code:
async Task<bool> WaitForDependantObject(int idVal)
{
// how long synchronous part of method takes (before first await)
System.Threading.Thread.Sleep(1000);
// method returns to the caller as soon as the awaiting starts
await Task.Delay(2000); // how long the IO or other async operation takes
// simulate data processing; this continuation runs on a thread pool thread unless
// used in WPF/WinForms/ASP.NET and ConfigureAwait(false) is not applied to the await above.
System.Threading.Thread.Sleep(1000);
bool waitDone = true;
return waitDone;
}
You can do this using Task.Factory.StartNew.
Replace this:
dependantTasksQuery.Add(WaitForDependantObject(dependantID));
with this:
dependantTasksQuery.Add(
Task.Factory.StartNew(
() => WaitForDependantObject(dependantID)
)
);
This will run your method within a new Task and add the task to your List.
You will also want to change the method signature of WaitForDependantObject to be:
bool WaitForDependantObject(int idVal)
You can then wait for your tasks to complete with:
Task.WaitAll(dependantTasksQuery.ToArray());
And get your results with:
bool[] lengths = dependantTasksQuery.Select(task => task.Result).ToArray();
I'm having some trouble getting a task to asynchronously delay. I am writing an application that needs to run at a scale of tens/hundreds of thousands of asynchronously executing scripts. I am doing this using C# Actions, and sometimes, during the execution of a particular sequence, a script needs to wait on an external resource to reach an expected state in order to execute properly. At first I wrote this using Thread.Sleep(), but that turned out to be a torpedo in the application's performance, so I'm looking into async/await for an async sleep. But I can't get it to actually wait on the pause! Can someone explain this?
static void Main(string[] args)
{
var sync = new List<Action>();
var async = new List<Action>();
var syncStopWatch = new Stopwatch();
sync.Add(syncStopWatch.Start);
sync.Add(() => Thread.Sleep(1000));
sync.Add(syncStopWatch.Stop);
sync.Add(() => Console.Write("Sync:\t" + syncStopWatch.ElapsedMilliseconds + "\n"));
var asyncStopWatch = new Stopwatch();
async.Add(asyncStopWatch.Start);
async.Add(async () => await Task.Delay(1000));
async.Add(asyncStopWatch.Stop);
async.Add(() => Console.Write("Async:\t" + asyncStopWatch.ElapsedMilliseconds + "\n"));
foreach (Action a in sync)
{
a.Invoke();
}
foreach (Action a in async)
{
a.Invoke();
}
}
The results of the execution are:
Sync: 999
Async: 2
How do I get it to wait asynchronously?
You're running into a problem with async void. When you pass an async lambda to an Action, the compiler is creating an async void method for you.
As a best practice, you should avoid async void.
One way to do this is to have your list of actions actually be a List<Func<Task>> instead of List<Action>. This allows you to queue async Task methods instead of async void methods.
This means your "execution" code would have to wait for each Task as it completes. Also, your synchronous methods would have to return Task.FromResult(0) or something like that so they match the Func<Task> signature.
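A minimal sketch of that idea (my own illustration; Task.CompletedTask serves the same purpose here as Task.FromResult(0)), queuing Func<Task> items and awaiting each one in turn:
static async Task Main()
{
    var steps = new List<Func<Task>>();
    var stopwatch = new Stopwatch();

    steps.Add(() => { stopwatch.Start(); return Task.CompletedTask; }); // synchronous step
    steps.Add(async () => await Task.Delay(1000));                      // truly asynchronous step
    steps.Add(() => { stopwatch.Stop(); return Task.CompletedTask; });
    steps.Add(() => { Console.WriteLine("Async:\t" + stopwatch.ElapsedMilliseconds); return Task.CompletedTask; });

    foreach (Func<Task> step in steps)
    {
        await step(); // the delay is actually awaited, so roughly 1000 ms is measured
    }
}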
If you want a bigger scope solution, I recommend you strongly consider TPL Dataflow instead of creating your own queue.