How to constrain concurrency the right way in Rx.NET - C#

Please, observe the following code snippet:
var result = await GetSource(1000).SelectMany(s => getResultAsync(s).ToObservable()).ToList();
The problem with this code is that getResultAsync runs concurrently in an unconstrained fashion, which may not be what we want in certain cases. Suppose I want to restrict its concurrency to at most 10 concurrent invocations. What is the Rx.NET way to do it?
I am enclosing a simple console application that demonstrates the subject and my lame solution to the described problem.
There is a bit of extra code, like the Stats class and the artificial random sleeps. They are there to ensure I truly get concurrent execution and can reliably compute the max concurrency reached during the process.
The method RunUnconstrained demonstrates the naive, unconstrained run. The method RunConstrained shows my solution, which is not very elegant. Ideally, I would like to constrain the concurrency by simply applying a dedicated Rx operator to the monad, without sacrificing performance.
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using System.Reactive.Linq;
using System.Reactive.Threading.Tasks;
using System.Threading;
using System.Threading.Tasks;

namespace RxConstrainedConcurrency
{
    class Program
    {
        public class Stats
        {
            public int MaxConcurrentCount;
            public int CurConcurrentCount;
            public readonly object MaxConcurrentCountGuard = new object();
        }

        static void Main()
        {
            RunUnconstrained().GetAwaiter().GetResult();
            RunConstrained().GetAwaiter().GetResult();
        }

        static async Task RunUnconstrained()
        {
            await Run(AsyncOp);
        }

        static async Task RunConstrained()
        {
            using (var sem = new SemaphoreSlim(10))
            {
                await Run(async (s, pause, stats) =>
                {
                    // ReSharper disable AccessToDisposedClosure
                    await sem.WaitAsync();
                    try
                    {
                        return await AsyncOp(s, pause, stats);
                    }
                    finally
                    {
                        sem.Release();
                    }
                    // ReSharper restore AccessToDisposedClosure
                });
            }
        }

        static async Task Run(Func<string, int, Stats, Task<int>> getResultAsync)
        {
            var stats = new Stats();
            var rnd = new Random(0x1234);
            var result = await GetSource(1000).SelectMany(s => getResultAsync(s, rnd.Next(30), stats).ToObservable()).ToList();

            Debug.Assert(stats.CurConcurrentCount == 0);
            Debug.Assert(result.Count == 1000);
            Debug.Assert(!result.Contains(0));
            Debug.WriteLine("Max concurrency = " + stats.MaxConcurrentCount);
        }

        static IObservable<string> GetSource(int count)
        {
            return Enumerable.Range(1, count).Select(i => i.ToString()).ToObservable();
        }

        static Task<int> AsyncOp(string s, int pause, Stats stats)
        {
            return Task.Run(() =>
            {
                int cur = Interlocked.Increment(ref stats.CurConcurrentCount);
                if (stats.MaxConcurrentCount < cur)
                {
                    lock (stats.MaxConcurrentCountGuard)
                    {
                        if (stats.MaxConcurrentCount < cur)
                        {
                            stats.MaxConcurrentCount = cur;
                        }
                    }
                }
                try
                {
                    Thread.Sleep(pause);
                    return int.Parse(s);
                }
                finally
                {
                    Interlocked.Decrement(ref stats.CurConcurrentCount);
                }
            });
        }
    }
}

You can do this in Rx using the overload of Merge that constrains the number of concurrent subscriptions to inner observables.
This form of Merge is applied to a stream of streams.
Ordinarily, using SelectMany to invoke an async task from an event does two jobs: it projects each event into an observable stream whose single event is the result, and it flattens all the resulting streams together.
To use Merge we must instead use a regular Select to project each event into the invocation of an async task (thus creating a stream of streams), and then use Merge to flatten the result. Merge does this in a constrained way by subscribing to no more than a supplied fixed number of the inner streams at any point in time.
We must be careful to invoke each asynchronous task only upon subscription to the wrapping inner stream. Converting an async task to an observable with ToObservable() will actually start the task immediately, rather than on subscription, so we must defer evaluation until subscription using Observable.Defer.
Here's an example putting all these steps together:
void Main()
{
    var xs = Observable.Range(0, 10); // source events

    // "Double" here is our async operation to be constrained,
    // in this case to 3 concurrent invocations
    xs.Select(x => Observable.Defer(() => Double(x).ToObservable()))
      .Merge(3)
      .Subscribe(Console.WriteLine,
                 () => Console.WriteLine("Max: " + MaxConcurrent));
}

private static int Concurrent;
private static int MaxConcurrent;
private static readonly object gate = new object();

public async Task<int> Double(int x)
{
    var concurrent = Interlocked.Increment(ref Concurrent);
    lock (gate)
    {
        MaxConcurrent = Math.Max(concurrent, MaxConcurrent);
    }
    await Task.Delay(TimeSpan.FromSeconds(1));
    Interlocked.Decrement(ref Concurrent);
    return x * 2;
}
The maximum concurrency output here will be "3". Remove the Merge to go "unconstrained" and you'll get "10" instead.
Another (equivalent) way of getting the Defer effect that reads a bit nicer is to use FromAsync instead of Defer + ToObservable:
xs.Select(x => Observable.FromAsync(() => Double(x))).Merge(3)
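Putting it together with the original snippet from the question, the constrained version might look like this (a sketch, assuming getResultAsync as declared in the question):

var result = await GetSource(1000)
    .Select(s => Observable.FromAsync(() => getResultAsync(s)))
    .Merge(10) // at most 10 concurrent invocations
    .ToList();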

Related

Observable with backpressure in C#

Is there a way in C# rx to handle backpressure?
I'm trying to call a web api from the results of a paged query. This web api is very fragile and I need to have no more than, say, 3 concurrent calls, so the program should be something like:
Fetch a page from the db
Call the web api with a maximum of three concurrent calls for each record on the page
Save the results back to the db
Fetch another page and repeat until there are no more results.
I'm not really getting the sequence I'm after: the db fetches all the records regardless of whether they can be processed or not.
I've tried a variety of things, including tweaking the ObserveOn operator, implementing a semaphore, and a few other things. Could I get a little bit of guidance to implement something like this?
using System;
using System.Collections.Generic;
using System.Linq;
using System.Reactive.Concurrency;
using System.Reactive.Linq;
using System.Reactive.Threading.Tasks;
using System.Threading;
using System.Threading.Tasks;
using Castle.Core.Internal;
using Xunit;
using Xunit.Abstractions;

namespace ProductValidation.CLI.Tests.Services
{
    public class Example
    {
        private readonly ITestOutputHelper output;

        public Example(ITestOutputHelper output)
        {
            this.output = output;
        }

        [Fact]
        public async Task RunsObservableToCompletion()
        {
            var repo = new Repository(output);
            var client = new ServiceClient(output);
            var results = repo.FetchRecords()
                .Select(x => client.FetchMoreInformation(x).ToObservable())
                .Merge(1)
                .Do(async x => await repo.Save(x));
            await results.LastOrDefaultAsync();
        }
    }

    public class Repository
    {
        private readonly ITestOutputHelper output;

        public Repository(ITestOutputHelper output)
        {
            this.output = output;
        }

        public IObservable<int> FetchRecords()
        {
            return Observable.Create<int>(async (observer) =>
            {
                var page = 1;
                var products = await FetchPage(page);
                while (!products.IsNullOrEmpty())
                {
                    foreach (var product in products)
                    {
                        observer.OnNext(product);
                    }
                    page += 1;
                    products = await FetchPage(page);
                }
                observer.OnCompleted();
            })
            .ObserveOn(SynchronizationContext.Current);
        }

        private async Task<IEnumerable<int>> FetchPage(int page)
        {
            // Simulate fetching a paged query.
            await Task.Delay(500).ToObservable().ObserveOn(new TaskPoolScheduler(new TaskFactory()));
            output.WriteLine("Fetching page {0}", page);
            if (page >= 4) return Enumerable.Empty<int>();
            return Enumerable.Range(1, 3).Select(_ => page);
        }

        public async Task Save(string id)
        {
            await Task.Delay(50); // Simulates latency
        }
    }

    public class ServiceClient
    {
        private readonly ITestOutputHelper output;
        private readonly SemaphoreSlim semaphore;

        public ServiceClient(ITestOutputHelper output)
        {
            this.output = output;
            this.semaphore = new SemaphoreSlim(2);
        }

        public async Task<string> FetchMoreInformation(int id)
        {
            try
            {
                output.WriteLine("Calling the web client for {0}", id);
                await semaphore.WaitAsync(); // Protection for the webapi not sending too many calls
                await Task.Delay(1000); // Simulates latency
                return id.ToString();
            }
            finally
            {
                semaphore.Release();
            }
        }
    }
}
Rx does not support backpressure, so there is no easy way to fetch the records from the DB at the same tempo that the records are processed. Maybe you could use a Subject&lt;Unit&gt; as a signaling mechanism, push a value every time a record is processed, and devise a way to use these signals at the producing site to fetch a new record from the DB when a signal is received. But it would be a messy and unidiomatic solution. TPL Dataflow is a more suitable tool than Rx for doing this kind of work: it natively supports the BoundedCapacity configuration option.
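For illustration, here is a rough Dataflow sketch of the same pipeline, reusing the Repository and ServiceClient from the question (the option values are assumptions, not a drop-in replacement):

var apiBlock = new TransformBlock<int, string>(
    x => client.FetchMoreInformation(x),
    new ExecutionDataflowBlockOptions
    {
        BoundedCapacity = 10,         // backpressure: SendAsync waits while the block is full
        MaxDegreeOfParallelism = 3    // at most 3 concurrent web api calls
    });
var saveBlock = new ActionBlock<string>(
    x => repo.Save(x),
    new ExecutionDataflowBlockOptions { BoundedCapacity = 10 });
apiBlock.LinkTo(saveBlock, new DataflowLinkOptions { PropagateCompletion = true });

// The producer awaits SendAsync, which completes only when the block has room,
// so pages are fetched at the tempo the records are processed.
var page = 1;
var products = await FetchPage(page);
while (products.Any())
{
    foreach (var product in products) await apiBlock.SendAsync(product);
    products = await FetchPage(++page);
}
apiBlock.Complete();
await saveBlock.Completion;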
Some comments regarding the code you've posted, which are not directly related to the backpressure issue:
The Merge operator with a maxConcurrent parameter imposes a limit on the concurrent subscriptions to the inner sequences, but this will have no effect if the inner sequences are already up and running. So you have to ensure that the inner sequences are cold, and a handy way to do this is the Defer operator:
.Select(x => Observable.Defer(() =>
client.FetchMoreInformation(x).ToObservable()))
A more common way to convert asynchronous methods to deferred observable sequences is the FromAsync operator:
.Select(x => Observable.FromAsync(() => client.FetchMoreInformation(x)))
Btw the Do operator does not understand async delegates, so instead of:
.Do(async x => await repo.Save(x));
...which creates async void lambdas, it's better to do this:
.Select(x => Observable.FromAsync(() => repo.Save(x)))
.Merge(1);
Update: Here is an example of how you could use a SemaphoreSlim in order to implement backpressure in Rx:
const int boundedCapacity = 10;
using var semaphore = new SemaphoreSlim(boundedCapacity, boundedCapacity);

IObservable<int> results = repo
    .FetchRecords(semaphore)
    .Select(x => Observable.FromAsync(() => client.FetchMoreInformation(x)))
    .Merge(1)
    .Select(x => Observable.FromAsync(() => repo.Save(x)))
    .Merge(1)
    .Do(_ => semaphore.Release());

await results.DefaultIfEmpty();
And inside the FetchRecords method:
//...
await semaphore.WaitAsync();
observer.OnNext(product);
//...
This is a fragile solution, because it depends on propagating all elements through the pipeline. If in the future you decide to include filtering or throttling inside the pipeline, then the one-to-one relationship between WaitAsync and Release will be violated, with the most probable outcome being a deadlocked pipeline.

Many to Many TPL Dataflow does not process all inputs

I have a TPL Dataflow pipeline with two sources and two targets linked in a many-to-many fashion. The target blocks appear to complete successfully; however, it usually drops one or more inputs. I've attached the simplest possible full repro I could come up with below. Any ideas?
Notes:
The problem only occurs if the artificial delay is used while generating the input.
Complete() is successfully called for both sources, but one of the sources' Completion tasks hangs in the WaitingForActivation state, even though both targets complete successfully.
I can't find any documentation stating many-to-many dataflows aren't supported, and this question's answer implies they are - https://social.msdn.microsoft.com/Forums/en-US/19d831af-2d3f-4d95-9672-b28ae53e6fa0/completion-of-complex-graph-dataflowgraph-object?forum=tpldataflow
using System;
using System.Diagnostics;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using System.Threading.Tasks.Dataflow;

class Program
{
    private const int NumbersPerSource = 10;
    private const int MaxDelayMilliseconds = 10;

    static async Task Main(string[] args)
    {
        int numbersProcessed = 0;
        var source1 = new BufferBlock<int>();
        var source2 = new BufferBlock<int>();
        var target1 = new ActionBlock<int>(i => Interlocked.Increment(ref numbersProcessed));
        var target2 = new ActionBlock<int>(i => Interlocked.Increment(ref numbersProcessed));

        var linkOptions = new DataflowLinkOptions() { PropagateCompletion = true };
        source1.LinkTo(target1, linkOptions);
        source1.LinkTo(target2, linkOptions);
        source2.LinkTo(target1, linkOptions);
        source2.LinkTo(target2, linkOptions);

        var task1 = Task.Run(() => Post(source1));
        var task2 = Task.Run(() => Post(source2));

        // source1 or source2 Completion tasks may never complete even though Complete is always successfully called.
        //await Task.WhenAll(task1, task2, source1.Completion, source2.Completion, target1.Completion, target2.Completion);
        await Task.WhenAll(task1, task2, target1.Completion, target2.Completion);

        Console.WriteLine($"{numbersProcessed} of {NumbersPerSource * 2} numbers processed.");
    }

    private static async Task Post(BufferBlock<int> source)
    {
        foreach (var i in Enumerable.Range(0, NumbersPerSource))
        {
            await Task.Delay(TimeSpan.FromMilliseconds(GetRandomMilliseconds()));
            Debug.Assert(source.Post(i));
        }
        source.Complete();
    }

    private static Random Random = new Random();

    private static int GetRandomMilliseconds()
    {
        lock (Random)
        {
            return Random.Next(0, MaxDelayMilliseconds);
        }
    }
}
As #MikeJ pointed out in a comment, linking the blocks with the PropagateCompletion option in a many-to-many dataflow configuration can cause premature completion of some target blocks. In this case target1 and target2 are both marked as completed when either of the two source blocks completes, leaving the other source unable to complete, because there are still messages in its output buffer. These messages are never going to be consumed, because none of the linked target blocks is willing to accept them.
To fix this problem you could use the custom PropagateCompletion method below:
public static void PropagateCompletion(IDataflowBlock[] sources,
    IDataflowBlock[] targets)
{
    // Arguments validation omitted

    Task allSourcesCompletion = Task.WhenAll(sources.Select(s => s.Completion));
    ThreadPool.QueueUserWorkItem(async _ =>
    {
        try { await allSourcesCompletion.ConfigureAwait(false); } catch { }

        Exception exception = allSourcesCompletion.IsFaulted ?
            allSourcesCompletion.Exception : null;

        foreach (var target in targets)
        {
            if (exception is null) target.Complete(); else target.Fault(exception);
        }
    });
}
Usage example:
source1.LinkTo(target1);
source1.LinkTo(target2);
source2.LinkTo(target1);
source2.LinkTo(target2);
PropagateCompletion(new[] { source1, source2 }, new[] { target1, target2 });
Notice that no DataflowLinkOptions are passed when linking the sources to the targets in this example.

Keep running a specific number of tasks async

I'm currently working on a concurrent file downloader.
For that reason I want to parametrize the number of concurrent tasks. I don't want to wait for all the tasks to be completed, but to keep the same number running.
In fact, this thread on Stack Overflow gave me a proper clue, but I'm struggling to make it async:
Keep running a specific number of tasks
Here is my code:
public async Task StartAsync()
{
    var semaphore = new SemaphoreSlim(1, _concurrentTransfers);
    var queueHasMessages = true;

    while (queueHasMessages)
    {
        try
        {
            await Task.Run(async () =>
            {
                await semaphore.WaitAsync();
                await asyncStuff();
            });
        }
        finally
        {
            semaphore.Release();
        };
    }
}
But the code just gets executed one task at a time. I think the await is preventing me from generating the desired number of tasks, but I don't know how to avoid it while respecting the limit established by the semaphore.
If I add all the tasks to a list and do a WhenAll, the semaphore throws an exception since it has reached its max count.
Any suggestions?
It was brought to my attention that the struck-through solution will drop any exceptions that occur during execution. That's bad.
Here is a solution that will not drop exceptions:
Task.Run is a factory method for creating a Task. You can verify this yourself via the IntelliSense return value. You can assign the returned Task anywhere you like.
"await" is an operator that will wait until the task it operates on completes. You are able to use any Task with the await operator.
public static async Task RunTasksConcurrently()
{
    IList<Task> tasks = new List<Task>();
    for (int i = 1; i < 4; i++)
    {
        tasks.Add(RunNextTask());
    }
    foreach (var task in tasks)
    {
        await task;
    }
}

public static async Task RunNextTask()
{
    while (true)
    {
        await Task.Delay(500);
    }
}
By adding the Tasks we create to a list, we can await them all later in execution.
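Equivalently, the final foreach loop could be replaced with a single call, which also surfaces any exceptions from the tasks:

await Task.WhenAll(tasks);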
Previous Answer below
Edit: With the clarification I think I understand better.
Instead of running every task at once, you want to start 3 tasks, and as soon as a task is finished, run the next one.
I believe this can be done using the .ContinueWith(Action&lt;Task&gt;) method.
See if this gets closer to your intended solution.
public void SpawnInitialTasks()
{
    for (int i = 0; i < 3; i++)
    {
        RunNextTask();
    }
}

public void RunNextTask()
{
    // Recurse here to keep running tasks whenever we finish one.
    Task.Run(async () => await Task.Delay(500))
        .ContinueWith(t => RunNextTask());
}
The idea is that we spawn 3 tasks right away, then whenever one finishes we spawn the next. If you need to keep data flowing between the tasks, you can use parameters:
RunNextTask(DataObject object)
You can do this easily the old-fashioned way, without using await, by using Parallel.ForEach(), which lets you specify the maximum number of concurrent threads to use.
For example:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

namespace Demo
{
    class Program
    {
        public static void Main(string[] args)
        {
            IEnumerable<string> filenames = Enumerable.Range(1, 100).Select(x => x.ToString());

            Parallel.ForEach(
                filenames,
                new ParallelOptions { MaxDegreeOfParallelism = 4 },
                download
            );
        }

        static void download(string filepath)
        {
            Console.WriteLine("Downloading " + filepath);
            Thread.Sleep(1000); // Simulate downloading time.
            Console.WriteLine("Downloaded " + filepath);
        }
    }
}
If you run this and observe the output, you'll see that the "files" are being "downloaded" in batches.
A better simulation is to change download() so that it takes a random amount of time to process each "file", like so:
static Random rng = new Random();

static void download(string filepath)
{
    Console.WriteLine("Downloading " + filepath);
    Thread.Sleep(500 + rng.Next(1000)); // Simulate random downloading time.
    Console.WriteLine("Downloaded " + filepath);
}
Try that and see the difference in the output.
However, if you want a more modern way to do this, you could look into the Dataflow part of the TPL (Task Parallel Library) - this works well with async methods.
This is a lot more complicated to get to grips with, but it's a lot more powerful. You could use an ActionBlock to do it, but describing how to do that is a bit beyond the scope of an answer I could give here.
Have a look at this other answer on StackOverflow; it gives a brief example.
Also note that TPL Dataflow is not built into .Net - you have to get it from NuGet.
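For illustration, a minimal sketch of the ActionBlock approach mentioned above might look like this (using the same simulated download as before; requires a using System.Threading.Tasks.Dataflow; directive and an async context for the final await):

var downloader = new ActionBlock<string>(
    async filepath =>
    {
        Console.WriteLine("Downloading " + filepath);
        await Task.Delay(1000); // Simulate downloading time.
        Console.WriteLine("Downloaded " + filepath);
    },
    new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 4 });

foreach (var filename in filenames)
    downloader.Post(filename);

downloader.Complete();
await downloader.Completion;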

Await list of async predicates, but drop out on first false

Imagine the following class:
public class Checker
{
public async Task<bool> Check() { ... }
}
Now, imagine a list of instances of this class:
IEnumerable<Checker> checkers = ...
Now I want to check that every instance returns true:
checkers.All(c => c.Check());
Now, this won't compile, since Check() returns a Task<bool> not a bool.
So my question is: How can I best enumerate the list of checkers?
And how can I shortcut the enumeration as soon as a checker returns false?
(something I presume All( ) does already)
"Asynchronous sequences" can always cause some confusion. For example, it's not clear whether your desired semantics are:
Start all checks simultaneously, and evaluate them as they complete.
Start the checks one at a time, evaluating them in sequence order.
There's a third possibility (start all checks simultaneously, and evaluate them in sequence order), but that would be silly in this scenario.
I recommend using Rx for asynchronous sequences. It gives you a lot of options, and it is a bit hard to learn, but it also forces you to think about exactly what you want.
The following code will start all checks simultaneously and evaluate them as they complete:
IObservable<bool> result = checkers.ToObservable()
.SelectMany(c => c.Check()).All(b => b);
It first converts the sequence of checkers to an observable, calls Check on them all, and checks whether they are all true. The first Check that completes with a false value will cause result to produce a false value.
In contrast, the following code will start the checks one at a time, evaluating them in sequence order:
IObservable<bool> result = checkers.Select(c => c.Check().ToObservable())
.Concat().All(b => b);
It first converts the sequence of checkers to a sequence of observables, and then concatenates those sequences (which starts them one at a time).
If you do not wish to use observables much and don't want to mess with subscriptions, you can await them directly. E.g., to call Check on all checkers and evaluate the results as they complete:
bool all = await checkers.ToObservable().SelectMany(c => c.Check()).All(b => b);
And how can I shortcut the enumeration as soon as a checker returns false?
This will check the tasks' results in order of completion. So if task #5 is the first to complete and returns false, the method returns false immediately, regardless of the other tasks. Slower tasks (#1, #2, etc.) would never be checked.
public static async Task<bool> AllAsync(this IEnumerable<Task<bool>> source)
{
    var tasks = source.ToList();

    while (tasks.Count != 0)
    {
        var finishedTask = await Task.WhenAny(tasks);
        if (!finishedTask.Result)
            return false;
        tasks.Remove(finishedTask);
    }

    return true;
}
Usage:
bool result = await checkers.Select(c => c.Check())
.AllAsync();
All wasn't built with async in mind (like all of LINQ), so you would need to implement that yourself:
async Task<bool> CheckAll()
{
    foreach (var checker in checkers)
    {
        if (!await checker.Check())
        {
            return false;
        }
    }
    return true;
}
You could make it more reusable with a generic extension method:
public static async Task<bool> AllAsync<TSource>(this IEnumerable<TSource> source, Func<TSource, Task<bool>> predicate)
{
    foreach (var item in source)
    {
        if (!await predicate(item))
        {
            return false;
        }
    }
    return true;
}
And use it like this:
var result = await checkers.AllAsync(c => c.Check());
You could do
checkers.All(c => c.Check().Result);
but that would run the tasks synchronously, which may be very slow depending on the implementation of Check().
Here's a fully functional test program, following in the steps of dcastro:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace AsyncCheckerTest
{
    public class Checker
    {
        public int Seconds { get; private set; }

        public Checker(int seconds)
        {
            Seconds = seconds;
        }

        public async Task<bool> CheckAsync()
        {
            await Task.Delay(Seconds * 1000);
            return Seconds != 3;
        }
    }

    class Program
    {
        static void Main(string[] args)
        {
            var task = RunAsync();
            task.Wait();
            Console.WriteLine("Overall result: " + task.Result);
            Console.ReadLine();
        }

        public static async Task<bool> RunAsync()
        {
            var checkers = new List<Checker>();
            checkers
                .AddRange(Enumerable.Range(1, 5)
                .Select(i => new Checker(i)));

            return await checkers
                .Select(c => c.CheckAsync())
                .AllAsync();
        }
    }

    public static class ExtensionMethods
    {
        public static async Task<bool> AllAsync(this IEnumerable<Task<bool>> source)
        {
            var tasks = source.ToList();

            while (tasks.Count != 0)
            {
                Task<bool> finishedTask = await Task.WhenAny(tasks);
                bool checkResult = finishedTask.Result;
                if (!checkResult)
                {
                    Console.WriteLine("Completed at " + DateTimeOffset.Now + "...false");
                    return false;
                }
                Console.WriteLine("Working... " + DateTimeOffset.Now);
                tasks.Remove(finishedTask);
            }
            return true;
        }
    }
}
Here's sample output:
Working... 6/27/2014 1:47:35 AM -05:00
Working... 6/27/2014 1:47:36 AM -05:00
Completed at 6/27/2014 1:47:37 AM -05:00...false
Overall result: False
Note that the entire evaluation ended when the exit condition was reached, without waiting for the rest to finish.
As a more out-of-the-box alternative, this seems to run the tasks in parallel and return shortly after the first failure:
var allResult = checkers
    .Select(c => Task.Factory.StartNew(() => c.Check().Result))
    .AsParallel()
    .All(t => t.Result);
I'm not too hot on TPL and PLINQ so feel free to tell me what's wrong with this.

Parallel ForEach wait 500 ms before spawning

I have this situation:
var tasks = new List<ITask> ...
Parallel.ForEach(tasks, currentTask => currentTask.Execute());
Is it possible to instruct PLINQ to wait for 500 ms before the next thread is spawned?
System.Threading.Thread.Sleep(5000);
You are using Parallel.ForEach totally wrong. You should make a special enumerator that rate-limits itself to getting data once every 500 ms.
I made some assumptions about how your DTO works, since you didn't provide any details.
private IEnumerable<SomeResource> GetRateLimitedResource()
{
    // Note: this must return IEnumerable<T> (not IEnumerator<T>)
    // for Parallel.ForEach to accept it.
    SomeResource someResource = null;
    do
    {
        someResource = _remoteProvider.GetData();
        if (someResource != null)
        {
            yield return someResource;
            Thread.Sleep(500);
        }
    } while (someResource != null);
}
Here is how your Parallel.ForEach should look then:
Parallel.ForEach(GetRateLimitedResource(), SomeFunctionToProcessSomeResource);
There are already some good suggestions. I would agree with others that you are using PLINQ in a manner it wasn't meant to be used.
My suggestion would be to use System.Threading.Timer. This is probably better than writing a method that returns an IEnumerable<> that forces a half second delay, because you may not need to wait the full half second, depending on how much time has passed since your last API call.
With the timer, it will invoke a delegate that you've provided it at the interval you specify, so even if the first task isn't done, a half second later it will invoke your delegate on another thread, so there won't be any extra waiting.
From your example code, it sounds like you have a list of tasks; in this case, I would use System.Collections.Concurrent.ConcurrentQueue to keep track of the tasks. Once the queue is empty, turn off the timer.
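A rough sketch of that timer-based approach, assuming the ITask list from the question (System.Threading.Timer plus a ConcurrentQueue, as suggested above):

var queue = new ConcurrentQueue<ITask>(tasks);
Timer timer = null;
timer = new Timer(_ =>
{
    if (queue.TryDequeue(out var task))
        task.Execute();       // fires every 500 ms regardless of task duration
    else
        timer?.Dispose();     // queue is empty: turn off the timer
}, null, dueTime: 0, period: 500);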
You could use Enumerable.Aggregate instead.
var task = tasks.Aggregate((t1, t2) =>
    t1.ContinueWith(async _ =>
    {
        Thread.Sleep(500);
        return t2.Result;
    }));
If you don't want the tasks chained, then there is also the overload of Select, assuming the tasks are in order of delay.
var tasks = Enumerable
    .Range(1, 10)
    .Select(x => Task.Run(() => x * 2))
    .Select((x, i) => Task.Delay(TimeSpan.FromMilliseconds(i * 500))
        .ContinueWith(_ => x.Result));

foreach (var result in tasks.Select(x => x.Result))
{
    Console.WriteLine(result);
}
From the comments, a better option would be to guard the resource instead of using a time delay.
static object Locker = new object();

static int GetResultFromResource(int arg)
{
    lock (Locker)
    {
        Thread.Sleep(500);
        return arg * 2;
    }
}

var tasks = Enumerable
    .Range(1, 10)
    .Select(x => Task.Run(() => GetResultFromResource(x)));

foreach (var result in tasks.Select(x => x.Result))
{
    Console.WriteLine(result);
}
In this case how about a Producer-Consumer pattern with a BlockingCollection<T>?
var tasks = new BlockingCollection<ITask>();

// add tasks; if this is an expensive process, put it out onto a Task
// tasks.Add(x);

// we're done producin' (allows GetConsumingEnumerable to finish)
tasks.CompleteAdding();

RunTasks(tasks);
With a single consumer thread:
static void RunTasks(BlockingCollection<ITask> tasks)
{
    foreach (var task in tasks.GetConsumingEnumerable())
    {
        task.Execute();

        // this may not be as accurate as you would like
        Thread.Sleep(500);
    }
}
If you have access to .Net 4.5 you can use Task.Delay:
static void RunTasks(BlockingCollection<ITask> tasks)
{
    foreach (var task in tasks.GetConsumingEnumerable())
    {
        // ContinueWith takes an Action<Task>, hence the discarded parameter.
        Task.Delay(500)
            .ContinueWith(_ => task.Execute())
            .Wait();
    }
}
