How do I exit from a C# GraphQL subscription? - c#

I am subscribing to a GraphQL source using this code:
public override async Task RunAsync(RunMode mode, CancellationToken cancel)
{
var consumer = new XYZConsumer(new GraphQLHttpClient(_options.Url, new NewtonsoftJsonSerializer()));
var result = await consumer.GetAllXYZ();
result.Subscribe(t =>
{
****How_do_I_get_out_of_here?*****
...
public class XYZConsumer
{
private IObservable<GraphQL.GraphQLResponse<SubscriptionApi>> subscriptionStream;
private readonly IGraphQLClient _client;
public XYZConsumer(IGraphQLClient client)
{
_client = client;
}
public async Task<IObservable<GraphQL.GraphQLResponse<SubscriptionApi>>> GetAllXYZ()
{
var posSubscription = new GraphQL.GraphQLRequest
{
Query = @"subscription mquery..."
};
subscriptionStream = _client.CreateSubscriptionStream<SubscriptionApi>(posSubscription);
return subscriptionStream;
}
}
I have a couple of possibly related questions:
GetAllXYZ is highlighted by the compiler with the message This async method lacks 'await' operators and will run synchronously. Consider using the 'await' operator ... Solutions I have found suggest I should be returning a task rather than subscriptionStream, but I need to return that, so how can I change this?
I get stuck in the subscribe loop. How can I exit at How_do_I_get_out_of_here? when, say, 100 records have been read?

Add using System.Reactive.Linq; and use Take to limit the stream to e.g. 20 records:
result
.Take(20)
.Subscribe(t => { ... });
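If the goal is to stop after 100 records and let RunAsync return once the stream completes, a minimal sketch along these lines might work; it reuses the XYZConsumer from the question, and ForEachAsync comes from System.Reactive.Linq and returns a Task that completes when the sequence ends or the token is cancelled:

using System.Reactive.Linq;

public override async Task RunAsync(RunMode mode, CancellationToken cancel)
{
    var consumer = new XYZConsumer(new GraphQLHttpClient(_options.Url, new NewtonsoftJsonSerializer()));
    var result = await consumer.GetAllXYZ();

    // Take(100) completes the sequence after 100 items have been received,
    // so the await below finishes and RunAsync can return.
    await result
        .Take(100)
        .ForEachAsync(t =>
        {
            // process each GraphQLResponse<SubscriptionApi> here
        }, cancel);
}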

Related

How to have only one thread for fire and forget task in asp.net webapi?

I have a need in my ASP.NET Web API (.NET Framework 4.7.2) to call Redis (using StackExchange.Redis) in order to delete a key in a fire-and-forget way, and I am running some stress tests.
As I am comparing the various ways to get the maximum speed:
I have already tested executing the command with the FireAndForget flag,
I have also measured a simple command to Redis by awaiting it.
And I am now searching for a way to collect the commands received in a 15 ms window and execute them all in one go by pipelining them.
I first tried using a Task.Run action to call Redis, but the problem I am observing is that under stress the memory of my Web API keeps climbing.
The memory is full of System.Threading.IThreadPoolWorkItem[] objects with the following code:
[HttpPost]
[Route("api/values/testpostfireforget")]
public ApiResult<int> DeleteFromBasketId([FromBody] int basketId)
{
var response = new DeleteFromBasketResponse<int>();
var cpt = Interlocked.Increment(ref counter);
Task.Run(async () => {
await db.StringSetAsync($"BASKET_TO_DELETE_{cpt}",cpt.ToString())
.ConfigureAwait(false);
});
return response;
}
So I think that under stress my API keeps enqueuing background tasks in memory and executes them one after the other as fast as it can, but more slowly than the requests are coming in...
So I am searching for a way to have only one long-lived background thread running with the ASP.NET Web API that could capture the commands to send to Redis and execute them by pipelining them.
I was thinking of running a background task by implementing the IHostedService interface, but it seems that in this case the background task would not share any state with my current HTTP request. So implementing an IHostedService would be handy for a scheduled background task, but not in my case, or I do not know how...
Based on the StackExchange.Redis documentation, you can use the CommandFlags.FireAndForget flag:
[HttpPost]
[Route("api/values/testpostfireforget")]
public ApiResult<int> DeleteFromBasketId([FromBody] int basketId)
{
var response = new DeleteFromBasketResponse<int>();
var cpt = Interlocked.Increment(ref counter);
db.StringSet($"BASKET_TO_DELETE_{cpt}", cpt.ToString(), flags: CommandFlags.FireAndForget);
return response;
}
Edit 1: another solution, based on the comments
You can use a producer/consumer approach. Something like this should work:
public class MessageBatcher
{
private readonly IDatabase target;
private readonly BlockingCollection<Action<IDatabaseAsync>> tasks = new();
private Task worker;
public MessageBatcher(IDatabase target) => this.target = target;
public void AddMessage(Action<IDatabaseAsync> task) => tasks.Add(task);
public IDisposable Start(int batchSize)
{
var cancellationTokenSource = new CancellationTokenSource();
worker = Task.Factory.StartNew(state =>
{
var count = 0;
var tokenSource = (CancellationTokenSource) state;
var box = new StrongBox<IBatch>(target.CreateBatch());
tokenSource.Token.Register(b => ((StrongBox<IBatch>)b).Value.Execute(), box);
foreach (var task in tasks.GetConsumingEnumerable(tokenSource.Token))
{
var batch = box.Value;
task(batch);
if (++count == batchSize)
{
batch.Execute();
box.Value = target.CreateBatch();
count = 0;
}
}
}, cancellationTokenSource, cancellationTokenSource.Token, TaskCreationOptions.LongRunning, TaskScheduler.Current);
return new Disposer(worker, cancellationTokenSource);
}
private class Disposer : IDisposable
{
private readonly Task worker;
private readonly CancellationTokenSource tokenSource;
public Disposer(Task worker, CancellationTokenSource tokenSource) => (this.worker, this.tokenSource) = (worker, tokenSource);
public void Dispose()
{
tokenSource.Cancel();
worker.Wait();
tokenSource.Dispose();
}
}
}
Usage:
private readonly MessageBatcher batcher;
ctor(MessageBatcher batcher) // ensure that the passed `batcher` is a singleton and has already been started
{
this.batcher= batcher;
}
[HttpPost]
[Route("api/values/testpostfireforget")]
public ApiResult<int> DeleteFromBasketId([FromBody] int basketId)
{
var response = new DeleteFromBasketResponse<int>();
var cpt = Interlocked.Increment(ref counter);
batcher.AddMessage(db => db.StringSetAsync($"BASKET_TO_DELETE_{cpt}", cpt.ToString(), flags: CommandFlags.FireAndForget));
return response;
}
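For completeness, a minimal sketch of how the batcher might be created and started once at application startup; the connection string, batch size and composition-root location are illustrative assumptions, not part of the answer:

    // e.g. in Global.asax / your composition root
    var redis = ConnectionMultiplexer.Connect("localhost");
    var batcher = new MessageBatcher(redis.GetDatabase());
    var batcherSubscription = batcher.Start(batchSize: 50); // keep the IDisposable around

    // Register 'batcher' as a singleton in your IoC container so every controller
    // shares the same background worker, and call batcherSubscription.Dispose()
    // on application shutdown to flush the in-flight batch and stop the worker.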

How can I get the result of my work in a Task

I need to do some work in a Task (an infinite loop for monitoring), but how can I get the result of this work?
Is my logic for doing this wrong? I think this is a scope problem.
Here is a simplified example:
The variable starts as "first" and I want it to become "edit":
namespace my{
public class Program{
public static void Main(string[] args){
Logic p = new Logic();
Task t = new Task(p.process);
t.Start();
Console.WriteLine(p.getVar());// result="first"
}
}
public class Logic{
public string test = "first";
public void process(){
while(true){
//If condition here
this.test = "edit";
}
}
public String getVar(){
return this.test;
}
}
}
It can be done using a custom event. In your case it could be something like:
public event Action<string> OnValueChanged;
Then attach to it
p.OnValueChanged += (newValue) => Console.WriteLine(newValue);
And do not forget to fire it
this.test = "edit";
OnValueChanged?.Invoke(this.test);
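Put together with the classes from the question, a minimal sketch could look like this; the handler is attached before the task starts, and Console.ReadKey keeps the process alive long enough for the event to fire:

using System;
using System.Threading.Tasks;

public class Logic
{
    public string test = "first";
    public event Action<string> OnValueChanged;

    public void process()
    {
        // if condition here
        this.test = "edit";
        OnValueChanged?.Invoke(this.test);
    }
}

public class Program
{
    public static void Main(string[] args)
    {
        var p = new Logic();
        p.OnValueChanged += newValue => Console.WriteLine(newValue); // prints "edit"
        Task.Run(p.process);
        Console.ReadKey(); // keep the console app alive long enough to see the output
    }
}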
Tasks aren't threads; they don't need a .Start call to start them. All examples and tutorials show the use of Task.Run or Task.Factory.StartNew for a reason: tasks are a promise that a function will execute at some point in the future and produce a result. They will run on threads pulled from the ThreadPool when a task scheduler decides they should. Creating cold tasks and calling .Start doesn't guarantee they will start; it simply makes the code a lot more difficult to read.
In the simplest case, polling e.g. a remote HTTP endpoint could be as simple as:
public static async Task Main()
{
var client=new HttpClient{ BaseAddress=new Uri(serverUrl) };
while(true)
{
var response=await client.GetAsync(relativeServiceUrl);
if(!response.IsSuccessStatusCode)
{
//That was an error, do something with it
}
await Task.Delay(1000);
}
}
There's no need to start a new Task because GetAsync is asynchronous. WCF and ADO.NET also provide asynchronous execution methods.
If there's no asynchronous method to call, or if we need to perform some heavy work before the async call, we can use Task.Run to start a method in parallel and await it to finish:
public bool CheckThatService(string serviceUrl)
{
....
}
public static async Task Main()
{
var url="...";
//...
while(true)
{
var ok=await Task.Run(()=>CheckThatService(url));
if(!ok)
{
//That was an error, do something with it
}
await Task.Delay(1000);
}
}
What if we want to test multiple systems in parallel? We can start multiple tasks in parallel, await all of them to complete and check their results:
public static async Task Main()
{
var urls=new[]{"...","..."};
//...
while(true)
{
var tasks=urls.Select(url=>Task.Run(()=>CheckThatService(url)));
var responses=await Task.WhenAll(tasks);
foreach(var response in responses)
{
//Check the value, do something
}
await Task.Delay(1000);
}
}
Task.WhenAll returns an array with the results in the order the tasks were created. This allows checking the index to find the original URL. A better idea would be to return the result and URL together, e.g. using tuples:
public static (bool ok,string url) CheckThatService(string serviceUrl)
{
....
return (true,serviceUrl);
}
The code wouldn't change a lot:
var tasks=urls.Select(url=>Task.Run(()=>CheckThatService(url)));
var responses=await Task.WhenAll(tasks);
foreach(var response in responses.Where(resp=>!resp.ok))
{
//Check the value, do something
}
What if we wanted to store the results from all the calls? We can't use a List or Queue because they aren't thread safe. We can use a ConcurrentQueue instead:
static ConcurrentQueue<string> _results=new ConcurrentQueue<string>();
public static (bool ok,string url) CheckThatService(string serviceUrl)
{
....
_results.Enqueue(someresult);
return (true,serviceUrl);
}
If we want to report progress regularly we can use IProgress<T> as shown in Enabling Progress and Cancellation in Async APIs.
We could put all the monitoring code in a separate method/class that accepts an IProgress<T> parameter with a progress object that can report success, error messages and the URL that caused them, e.g.:
class MonitorDTO
{
public string Url{get;set;}
public bool Success{get;set;}
public string Message{get;set;}
public MonitorDTO(string url,bool success,string msg)
{
//...
}
}
class MyMonitor
{
string[] _urls;
public MyMonitor(string[] urls)
{
_urls=urls;
}
public async Task Run(IProgress<MonitorDTO> progress)
{
while(true)
{
foreach(var url in _urls)
{
var ok=await Task.Run(()=>CheckThatService(url));
if(!ok)
{
progress.Report(new MonitorDTO(url,ok,"some message"));
}
}
await Task.Delay(1000);
}
}
}
This class could be used in this way:
public static async Task Main()
{
var urls=new[]{....};
var monitor=new MyMonitor(urls);
var progress=new Progress<MonitorDTO>(pg=>{
Console.WriteLine($"{pg.Success} for {pg.Url}: {pg.Message}");
});
await monitor.Run(progress);
}
Enabling Progress and Cancellation in Async APIs shows how to use the CancellationTokenSource to implement another important part of a monitoring class - cancelling it. The monitoring method could check the status of a cancellation token periodically and stop monitoring when cancellation is requested:
public Task Run(IProgress<MonitorDTO> progress,CancellationToken ct)
{
while(!ct.IsCancellationRequested)
{
//...
}
}
public static async Task Main()
{
var urls=new[]{....};
var monitor=new MyMonitor(urls);
var progress=new Progress<MonitorDTO>(pg=>{
Console.WriteLine($"{pg.Success} for {pg.Url}: {pg.Message}");
});
var cts = new CancellationTokenSource();
//Not awaiting yet!
var monitorTask=monitor.Run(progress,cts.Token);
//Keep running until the first keypress
Console.ReadKey();
//Cancel and wait for the monitoring class to gracefully stop
cts.Cancel();
await monitorTask;
}
In this case the loop will exit when the CancellationToken is cancelled. By not awaiting MyMonitor.Run() immediately, we can keep working on the main thread until an event occurs that signals monitoring should stop.
The getVar method is executed before the process method.
Make sure that you wait until your task is finished before you call the getVar method.
Logic p = new Logic();
Task t = new Task(p.process);
t.Start();
t.Wait(); // Add this line!
Console.WriteLine(p.getVar());
If you want to learn more about the Wait method, please check this link.

Unit Testing Cache Behaviour of Akavache with TestScheduler

So I'm trying to test caching behaviour in an app that's using Akavache.
My test looks like this:
using Akavache;
using Microsoft.Reactive.Testing;
using Moq;
using NUnit.Framework;
using ReactiveUI.Testing;
using System;
using System.Threading.Tasks;
[TestFixture]
public class CacheFixture
{
[Test]
public async Task CachingTest()
{
var scheduler = new TestScheduler();
// replacing the TestScheduler with the scheduler below works
// var scheduler = CurrentThreadScheduler.Instance;
var cache = new InMemoryBlobCache(scheduler);
var someApi = new Mock<ISomeApi>();
someApi.Setup(s => s.GetSomeStrings())
.Returns(Task.FromResult("helloworld")).Verifiable();
var apiWrapper = new SomeApiWrapper(someApi.Object, cache,
TimeSpan.FromSeconds(10));
var string1 = await apiWrapper.GetSomeStrings();
someApi.Verify(s => s.GetSomeStrings(), Times.Once());
StringAssert.AreEqualIgnoringCase("helloworld", string1);
scheduler.AdvanceToMs(5000);
// without the TestScheduler, I'd have to 'wait' here
// await Task.Delay(5000);
var string2 = await apiWrapper.GetSomeStrings();
someApi.Verify(s => s.GetSomeStrings(), Times.Once());
StringAssert.AreEqualIgnoringCase("helloworld", string2);
}
}
The SomeApiWrapper uses an internal api (mocked with new Mock<ISomeApi>()) that - for simplicity's sake - just returns a string. The problem now is that the second string is never returned. The SomeApiWrapper class that handles the caching looks like this:
using Akavache;
using System;
using System.Reactive.Linq;
using System.Threading.Tasks;
public class SomeApiWrapper
{
private IBlobCache Cache;
private ISomeApi Api;
private TimeSpan Timeout;
public SomeApiWrapper(ISomeApi api, IBlobCache cache, TimeSpan cacheTimeout)
{
Cache = cache;
Api = api;
Timeout = cacheTimeout;
}
public async Task<string> GetSomeStrings()
{
var key = "somestrings";
var cachedStrings = Cache.GetOrFetchObject(key, DoGetStrings,
Cache.Scheduler.Now.Add(Timeout));
// this is the last step, after this it just keeps running
// but never returns - but only for the 2nd call
return await cachedStrings.FirstOrDefaultAsync();
}
private async Task<string> DoGetStrings()
{
return await Api.GetSomeStrings();
}
}
Debugging only leads me to the line return await cachedStrings.FirstOrDefaultAsync(); - and it never finishes after that.
When I replace the TestScheduler with the standard (CurrentThreadScheduler.Instance) and the scheduler.AdvanceToMs(5000) with await Task.Delay(5000), everything works as expected but I don't want unit tests running for multiple seconds.
A similar test, where the TestScheduler is advanced past the cache timeout also succeeds. It's just this scenario, where the cache entry should not expire in between the two method calls.
Is there something I'm doing wrong in the way I'm using TestScheduler?
This is a fairly common problem when bouncing between the Task and the IObservable paradigms. It is further exacerbated by trying to wait before moving forward in the tests.
The key problem is that you are blocking* here
return await cachedStrings.FirstOrDefaultAsync();
I say blocking in the sense that the code can not continue to process until this statement yields.
On the first run the cache can not find the key, so it executes your DoGetStrings. The issue surfaces on the second run, where the cache is populated. This time (I guess) the fetching of the cached data is scheduled. You need to invoke the request, observe the sequence, then pump the scheduler.
The corrected code is here (but requires some API changes)
[TestFixture]
public class CacheFixture
{
[Test]
public async Task CachingTest()
{
var testScheduler = new TestScheduler();
var cache = new InMemoryBlobCache(testScheduler);
var cacheTimeout = TimeSpan.FromSeconds(10);
var someApi = new Mock<ISomeApi>();
someApi.Setup(s => s.GetSomeStrings())
.Returns(Task.FromResult("helloworld")).Verifiable();
var apiWrapper = new SomeApiWrapper(someApi.Object, cache, cacheTimeout);
var string1 = await apiWrapper.GetSomeStrings();
someApi.Verify(s => s.GetSomeStrings(), Times.Once());
StringAssert.AreEqualIgnoringCase("helloworld", string1);
testScheduler.AdvanceToMs(5000);
var observer = testScheduler.CreateObserver<string>();
apiWrapper.GetSomeStrings().Subscribe(observer);
testScheduler.AdvanceByMs(cacheTimeout.TotalMilliseconds);
someApi.Verify(s => s.GetSomeStrings(), Times.Once());
StringAssert.AreEqualIgnoringCase("helloworld", observer.Messages[0].Value.Value);
}
}
public interface ISomeApi
{
Task<string> GetSomeStrings();
}
public class SomeApiWrapper
{
private IBlobCache Cache;
private ISomeApi Api;
private TimeSpan Timeout;
public SomeApiWrapper(ISomeApi api, IBlobCache cache, TimeSpan cacheTimeout)
{
Cache = cache;
Api = api;
Timeout = cacheTimeout;
}
public IObservable<string> GetSomeStrings()
{
var key = "somestrings";
var cachedStrings = Cache.GetOrFetchObject(key, DoGetStrings,
Cache.Scheduler.Now.Add(Timeout));
//Return an observerable here instead of "blocking" with a task. -LC
return cachedStrings.Take(1);
}
private async Task<string> DoGetStrings()
{
return await Api.GetSomeStrings();
}
}
This code is green and runs sub-second.

Call async await method in sync calling method

This is only the idea of what I'm doing in a Windows service.
I got the idea from this video to do it with parallel processing.
I have two different methods and a model class.
Model Class code:
public class Email{
public string Recipient { get; set; }
public string Message { get; set; }
}
Methods is something like this:
public void LoadData(){
while(Main.IsProcessRunning){
// 1. Get All Emails
var emails = new dummyRepositories().GetAllEmails(); //This will return List<Emails>.
// 2. Send it
// After sending assume that the data will move to other table so it will not be query again for the next loop.
SendDataParallel(emails); // will this run asynchronously even though the calling method is sync?
// Will execution continue here, or wait until the emails have been sent?
// If it continues here before the emails have been sent,
// is there a chance to get the same email again on the next loop and send it twice?
}
}
//This will send emails in parallel
public async void SendDataParallel(IList<Email> emails){
var allTasks = emails.Select(SendDataAsync);
await TaskEx.WhenAll(allTasks);
}
//Assume this code sends the email asynchronously. (it will not actually send email; sample only)
public async void SendDataAsync(Email email){
using (var client = new HttpClient())
{
client.PostAsync(email);
}
}
I only want to get all queued emails, send them in parallel, and then wait until they have all been sent.
I'm avoiding using foreach on every email that I get.
Let's start at the bottom:
You dispose your client before you actually finish receiving the HttpResponseMessage asynchronously. You'll need to make your method async Task and await inside:
public async Task SendDataAsync(Email email)
{
using (var client = new HttpClient())
{
var response = await client.PostAsync(email);
}
}
Currently, your SendDataParallel doesn't compile. Again, it needs to return a Task:
public Task SendEmailsAsync(IList<Email> emails)
{
var emailTasks = emails.Select(SendDataAsync);
return Task.WhenAll(emailTasks);
}
At the top, you'll need to await on SendEmailsAsync:
public async Task LoadDataAsync()
{
while (Main.IsProcessRunning)
{
var emails = new dummyRepositories().GetAllEmails();
await SendEmailsAsync(emails);
}
}
Edit:
If you're running this inside a windows service, you can offload it to Task.Run and use the async keyword:
var controller = new Controller();
_processThread = Task.Run(async () => await controller.LoadDataAsync());
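A slightly fuller sketch of that wiring inside a ServiceBase implementation; the service class name is illustrative, while Controller, LoadDataAsync and Main.IsProcessRunning are taken from the question:

using System;
using System.ServiceProcess;
using System.Threading.Tasks;

public partial class EmailService : ServiceBase
{
    private Task _processTask;

    protected override void OnStart(string[] args)
    {
        var controller = new Controller();
        // run the async loop on the thread pool; OnStart returns immediately
        _processTask = Task.Run(() => controller.LoadDataAsync());
    }

    protected override void OnStop()
    {
        // the loop in LoadDataAsync checks this flag and exits after the current batch
        Main.IsProcessRunning = false;
        // give the in-flight sends a chance to finish before the service stops
        _processTask?.Wait(TimeSpan.FromSeconds(30));
    }
}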
Doesn't your compiler highlight your code with errors?
If you mark your method as async and it doesn't return any value, you should set its return type to Task, not void:
public async Task SendDataParallel(IList<Email> emails){
var allTasks = emails.Select(SendDataAsync);
await Task.WhenAll(allTasks);
}
Your second method should also return a Task, otherwise what would you (a)wait in the first method?
public async Task SendDataAsync(Email email){
using (var client = new HttpClient())
{
await client.PostAsync(email);
}
}
Now you can Select all your SendDataAsync tasks in SendDataParallel and .Wait() its task in LoadData in synchronous mode:
public void LoadData(){
while(Main.IsProcessRunning){
var emails = new dummyRepositories().GetAllEmails(); //This will return List<Emails>.
SendDataParallel(emails).Wait();
}
}
You can find more information reading the answers to other SO questions and the docs on MSDN:
Can somebody please explain async / await?
Brief explanation of Async/Await in .Net 4.5
how to and when use async and await
Asynchronous Programming with Async and Await
And as you used LINQ's Select(), which is based on a foreach cycle, the next article could also be useful:
Nested task inside loop

Accepting incoming requests asynchronously

I have identified a bottleneck in my TCP application, which I have simplified for the sake of this question.
I have a MyClient class that represents a connected client, and a MyWrapper class that represents a client fulfilling some conditions. If a MyClient fulfills those conditions, it qualifies for a wrapper.
I want to expose a method that allows the caller to await a MyWrapper, and that method should handle the negotiation and rejection of invalid MyClients:
public static async Task StartAccepting(CancellationToken token)
{
while (!token.IsCancellationRequested)
{
var wrapper = await AcceptWrapperAsync(token);
HandleWrapperAsync(wrapper);
}
}
Therefore AcceptWrapperAsync awaits a valid wrapper, and HandleWrapperAsync handles the wrapper asynchronously without blocking the thread, so AcceptWrapperAsync can get back to work as fast as it can.
How that method works internally is something like this:
public static async Task<MyWrapper> AcceptWrapperAsync(CancellationToken token)
{
while (!token.IsCancellationRequested)
{
var client = await AcceptClientAsync();
if (IsClientWrappable(client))
return new MyWrapper(client);
}
return null;
}
public static async Task<MyClient> AcceptClientAsync()
{
await Task.Delay(1000);
return new MyClient();
}
private static Boolean IsClientWrappable(MyClient client)
{
Thread.Sleep(500);
return true;
}
This code simulates that there is a client connection every second, and that it takes half a second to check whether the connection is suitable for a wrapper. AcceptWrapperAsync loops until a valid wrapper is generated, and then returns.
This approach, which works well, has a flaw. While IsClientWrappable is executing, no further clients can be accepted, creating a bottleneck when lots of clients are trying to connect at the same time. I am afraid that in real life, if the server goes down while it has many clients connected, coming back up is not going to be pleasant, because all of them will try to reconnect at the same time. I know it is very difficult for all of them to connect at the same time, but I would like to speed up the connection process.
Making IsClientWrappable async would just ensure that the executing thread is not blocked until the negotiation finishes, but the execution flow is still blocked.
How could I improve this approach to continuously accept new clients but still be able of awaiting a wrapper using AcceptWrapperAsync?
//this loop must never be blocked
while (!token.IsCancellationRequested)
{
var client = await AcceptClientAsync();
HandleClientAsync(client); //must not block
}
async Task HandleClientAsync(MyClient client) {
if (await IsClientWrappableAsync(client)) //make async as well, don't block
await HandleWrapperAsync(new MyWrapper(client));
}
This way you move the IsClientWrappable logic out of the accept loop and into the background async workflow.
If you do not wish to make IsClientWrappable non-blocking, just wrap it with Task.Run. It is essential that HandleClientAsync does not block so that its caller doesn't either.
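As a sketch of that last option, an alternative HandleClientAsync that wraps the existing synchronous check in Task.Run (types taken from the question, and HandleWrapperAsync assumed to return a Task):

async Task HandleClientAsync(MyClient client)
{
    // offload the blocking 500 ms check to the thread pool so the accept loop keeps running
    var wrappable = await Task.Run(() => IsClientWrappable(client));
    if (wrappable)
        await HandleWrapperAsync(new MyWrapper(client));
}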
TPL Dataflow to the rescue. I have created a "producer/consumer" object with two queues that:
accepts inputs from the "producer" and stores them in the "in" queue.
has an internal asynchronous task that reads from the "in" queue and processes the input in parallel, with a given maximum degree of parallelism.
puts each processed item in the "out" queue afterwards, as either a Result or an Exception.
lets a consumer await an item and then check whether the processing was successful or not.
I have done some testing and it seems to work fine, though I want to do more testing:
public sealed class ProcessingResult<TOut>
where TOut : class
{
public TOut Result { get; internal set; }
public Exception Error { get; internal set; }
}
public abstract class ProcessingBufferBlock<TIn,TOut>
where TIn:class
where TOut:class
{
readonly BufferBlock<TIn> _in;
readonly BufferBlock<ProcessingResult<TOut>> _out;
readonly CancellationToken _cancellation;
readonly SemaphoreSlim _semaphore;
public ProcessingBufferBlock(Int32 boundedCapacity, Int32 degreeOfParalellism, CancellationToken cancellation)
{
_cancellation = cancellation;
_semaphore = new SemaphoreSlim(degreeOfParalellism);
var options = new DataflowBlockOptions() { BoundedCapacity = boundedCapacity, CancellationToken = cancellation };
_in = new BufferBlock<TIn>(options);
_out = new BufferBlock<ProcessingResult<TOut>>(options);
StartReadingAsync();
}
private async Task StartReadingAsync()
{
await Task.Yield();
while (!_cancellation.IsCancellationRequested)
{
var incoming = await _in.ReceiveAsync(_cancellation);
ProcessThroughGateAsync(incoming);
}
}
private async Task ProcessThroughGateAsync(TIn input)
{
_semaphore.Wait(_cancellation);
Exception error=null;
TOut result=null;
try
{
result = await ProcessAsync(input);
}
catch (Exception ex)
{
error = ex;
}
finally
{
if(result!=null || error!=null)
_out.Post(new ProcessingResult<TOut>() { Error = error, Result = result });
_semaphore.Release(1);
}
}
protected abstract Task<TOut> ProcessAsync(TIn input);
public void Post(TIn item)
{
_in.Post(item);
}
public Task<ProcessingResult<TOut>> ReceiveAsync()
{
return _out.ReceiveAsync();
}
}
So the example I used on the OP would be something like this:
public class WrapperProcessingQueue : ProcessingBufferBlock<MyClient, MyWrapper>
{
public WrapperProcessingQueue(Int32 boundedCapacity, Int32 degreeOfParalellism, CancellationToken cancellation)
: base(boundedCapacity, degreeOfParalellism, cancellation)
{ }
protected override async Task<MyWrapper> ProcessAsync(MyClient input)
{
await Task.Delay(5000);
if (input.Id % 3 == 0)
return null;
return new MyWrapper(input);
}
}
And then I could add MyClient objects to that queue as fast as I get them; they would be processed in parallel, and the consumer would await the ones that pass the filter.
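For illustration, the accept loop could then feed the queue while a separate consumer awaits the processed results. This is only a sketch on top of the classes above; AcceptClientAsync and HandleWrapperAsync are the methods from the question, and HandleWrapperAsync is assumed to return a Task:

var cts = new CancellationTokenSource();
// boundedCapacity: 100, maximum degree of parallelism: 10
var queue = new WrapperProcessingQueue(100, 10, cts.Token);

// producer: accept clients as fast as they arrive and post them for wrapping
var producer = Task.Run(async () =>
{
    while (!cts.Token.IsCancellationRequested)
        queue.Post(await AcceptClientAsync());
});

// consumer: await processed items and handle the ones that qualified for a wrapper
var consumer = Task.Run(async () =>
{
    while (!cts.Token.IsCancellationRequested)
    {
        var item = await queue.ReceiveAsync();
        if (item.Error == null && item.Result != null)
            await HandleWrapperAsync(item.Result);
    }
});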
As I said, I want to do more testing, but any feedback will be very welcome.
Cheers.
