TPL BufferBlock Consume Method Not Being Called - c#

I want to implement the producer/consumer pattern using a BufferBlock that runs continuously, similar to the question here and the code here.
I tried to use an ActionBlock like the OP, but if the BufferBlock is full and new messages are in its queue, then the new messages never get added to the ConcurrentDictionary _queue.
In the code below, the ConsumeAsync method never gets called when a new message is added to the BufferBlock with this call: _messageBufferBlock.SendAsync(message)
How can I correct the code below so that the ConsumeAsync method is called every time a new message is added using _messageBufferBlock.SendAsync(message)?
public class PriorityMessageQueue
{
private volatile ConcurrentDictionary<int,MyMessage> _queue = new ConcurrentDictionary<int,MyMessage>();
private volatile BufferBlock<MyMessage> _messageBufferBlock;
private readonly Task<bool> _initializingTask; // not used but allows for calling async method from constructor
private int _dictionaryKey;
public PriorityMessageQueue()
{
_initializingTask = Init();
}
public async Task<bool> EnqueueAsync(MyMessage message)
{
return await _messageBufferBlock.SendAsync(message);
}
private async Task<bool> ConsumeAsync()
{
try
{
// This code does not fire when a new message is added to the BufferBlock
while (await _messageBufferBlock.OutputAvailableAsync())
{
// A message object is never received from the bufferblock
var message = await _messageBufferBlock.ReceiveAsync();
}
return true;
}
catch (Exception ex)
{
return false;
}
}
private async Task<bool> Init()
{
var executionDataflowBlockOptions = new ExecutionDataflowBlockOptions
{
MaxDegreeOfParallelism = Environment.ProcessorCount,
BoundedCapacity = 50
};
var prioritizeMessageBlock = new ActionBlock<MyMessage>(msg =>
{
SetMessagePriority(msg);
}, executionDataflowBlockOptions);
_messageBufferBlock = new BufferBlock<MyMessage>();
_messageBufferBlock.LinkTo(prioritizeMessageBlock, new DataflowLinkOptions { PropagateCompletion = true, MaxMessages = 50});
return await ConsumeAsync();
}
}
EDIT
I have removed all the extra code and added comments.

I'm still not completely certain what you're trying to accomplish but I'll try to point you in the right direction. Most of the code in the example isn't strictly necessary.
I need to know when a new message arrives
If this is your only requirement, then I'll assume you just need to run some arbitrary code whenever a new message is passed in. The easiest way to do that in dataflow is to use a TransformBlock and set that block as the initial receiver in your pipeline. Each block has its own buffer, so unless you need another buffer you can leave the BufferBlock out.
public class PriorityMessageQueue {
private TransformBlock<MyMessage, MyMessage> _messageReceiver;
public PriorityMessageQueue() {
var executionDataflowBlockOptions = new ExecutionDataflowBlockOptions {
MaxDegreeOfParallelism = Environment.ProcessorCount,
BoundedCapacity = 50
};
var prioritizeMessageBlock = new ActionBlock<MyMessage>(msg => {
SetMessagePriority(msg);
}, executionDataflowBlockOptions);
_messageReceiver = new TransformBlock<MyMessage, MyMessage>(msg => NewMessageReceived(msg), executionDataflowBlockOptions);
_messageReceiver.LinkTo(prioritizeMessageBlock, new DataflowLinkOptions { PropagateCompletion = true });
}
public async Task<bool> EnqueueAsync(MyMessage message) {
return await _messageReceiver.SendAsync(message);
}
private MyMessage NewMessageReceived(MyMessage message) {
//do something when a new message arrives
//pass the message along in the pipeline
return message;
}
private void SetMessagePriority(MyMessage message) {
//Handle a message
}
}
Of course, the other option would be to do whatever you need immediately within EnqueueAsync, before returning the task from SendAsync, but the TransformBlock gives you extra flexibility.
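For completeness, a minimal sketch of that simpler alternative, assuming prioritizeMessageBlock is promoted to a field and the TransformBlock is dropped (OnNewMessage is a hypothetical hook for whatever arrival logic you need):
public async Task<bool> EnqueueAsync(MyMessage message)
{
    // React to the new message here, before it enters the pipeline...
    OnNewMessage(message);
    // ...then hand it straight to the ActionBlock; its own buffer replaces the BufferBlock.
    return await _prioritizeMessageBlock.SendAsync(message);
}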

Related

Adding a listener event to a ConcurrentQueue or ConcurrentBag?

I have multiple tasks that grab messages from a queue in a 1:1 fashion. I want to add the messages from each thread into a ConcurrentBag and process them asynchronously as they come in. The purpose is to get messages off the queues as quickly as possible so the queues do not fill up. I just need help with a listener that waits until messages are added to the ConcurrentBag; then I need to remove the messages from the bag and process them.
private static ConcurrentQueue<string> messageList = new ConcurrentQueue<string>();
private static readonly SemaphoreSlim semaphore = new SemaphoreSlim(50);
void Main (string[] args)
{
List<Task> taskList = new List<Task>();
foreach(var job in JobList)
{
taskList.Add(Task.Run(() => ListenToQueue(job.QueueName)));
}
Task.WaitAll(taskList.ToArray());
}
private async Task ListenToQueue(string queueName)
{
var cancellationtoken = new CancellationTokenSource(TimeSpan.FromMinutes(60)).Token;
//possibly 5000 messages can be on a single queue
while(!cancellationtoken.IsCancellationRequested)
{
var message = getMessageFromQueue(queueName);
messageList.Enqueue(message); //Add the message from each thread to a thread safe List
}
}
I need a listener here so that each time something is added to the list, an event fires. I also need to remove the message from the list in a thread-safe way.
private async Task Listener()
{
var msg = string.Empty;
while (messageList.Count > 0)
{
messageList.TryDequeue(out msg);
await semaphore.WaitAsync();
Task.Run(() =>
{
try
{
if (!string.IsNullOrEmpty(msg))
{
_ = ProcessMessage(msg); // I do not want to await anything but just fire and let it go
}
}
finally
{
semaphore.Release();
}
});
}
}
These days, I recommend going with an async-compatible solution like System.Threading.Channels:
private static Channel<string> messageList = Channel.CreateUnbounded<string>();
private async Task ListenToQueue(string queueName)
{
var cancellationtoken = new CancellationTokenSource(TimeSpan.FromMinutes(60)).Token;
try
{
var message = await getMessageFromQueue(queueName, cancellationtoken);
await messageList.Writer.WriteAsync(message, cancellationtoken);
}
catch (OperationCanceledException)
{
// ignored
}
}
private async Task Listener()
{
await foreach (var msg in messageList.Reader.ReadAllAsync())
{
if (!string.IsNullOrEmpty(msg))
_ = Task.Run(() => ProcessMessage(msg));
}
}
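If you also want backpressure (the original code throttled with a SemaphoreSlim of 50), a bounded channel gives you that without a separate semaphore; this is a sketch of an alternative declaration, everything else above unchanged:
private static Channel<string> messageList = Channel.CreateBounded<string>(
    new BoundedChannelOptions(50)
    {
        // Writers asynchronously wait while 50 items are already buffered.
        FullMode = BoundedChannelFullMode.Wait
    });
WriteAsync then completes only when there is room, so a slow Listener naturally slows the queue readers down instead of letting the in-memory buffer grow without bound.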
But if you want (or need) to stay in the blocking world, there's a solution there, too. ConcurrentBag<T> and ConcurrentQueue<T> are seldom used directly. Instead, it's more common to use BlockingCollection<T>, which wraps a concurrent collection and provides a higher-level API, including GetConsumingEnumerable:
private static BlockingCollection<string> messageList = new();
private async Task ListenToQueue(string queueName)
{
var cancellationtoken = new CancellationTokenSource(TimeSpan.FromMinutes(60)).Token;
try
{
var message = await getMessageFromQueue(queueName, cancellationtoken);
messageList.Add(message, cancellationtoken);
}
catch (OperationCanceledException)
{
// ignored
}
}
private void Listener()
{
foreach (var msg in messageList.GetConsumingEnumerable())
{
if (!string.IsNullOrEmpty(msg))
_ = Task.Run(() => ProcessMessage(msg));
}
}
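One extra detail for the BlockingCollection version: GetConsumingEnumerable only ends when the collection is marked complete, so once all the ListenToQueue calls are done you would signal that:
// After the last producer has finished adding messages:
messageList.CompleteAdding(); // lets the foreach in Listener() exit
The Channel-based version above has the same idea: call messageList.Writer.Complete() when the producers are done, and ReadAllAsync will finish.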

InteropServices.COMException when adding item to ObservableCollection

I'm using SignalR Core to send me a list of messages, but this happens:
public ObservableCollection<ChatMessage> Messages { get; set; } = new ObservableCollection<ChatMessage>();
public async void InitSignalRAsync()
{
ChatMessage mess = new ChatMessage();
hubConnection = new HubConnectionBuilder().WithUrl("http://localhost:5000/chatHub").Build();
await hubConnection.StartAsync();
hubConnection.On<string, string>("ReceiveMessage", (user, message) =>
{
mess.user = user;
mess.message = message;
Messages.Add(mess);
});
}
I get an error when I receive the data, at Messages.Add(mess):
System.Runtime.InteropServices.COMException
As Martin noted, you must update ViewModel components from their UI thread.
However, for a solution, I recommend using the more general-purpose SynchronizationContext rather than the UWP-specific Dispatcher class. By using the more general type, your code is more reusable and more testable.
E.g.:
public ObservableCollection<ChatMessage> Messages { get; set; } = new ObservableCollection<ChatMessage>();
public async Task InitSignalRAsync()
{
var context = SynchronizationContext.Current;
hubConnection = new HubConnectionBuilder().WithUrl("http://localhost:5000/chatHub").Build();
await hubConnection.StartAsync();
hubConnection.On<string, string>("ReceiveMessage", (user, message) =>
{
var mess = new ChatMessage
{
user = user,
message = message,
};
context.Post(_ => Messages.Add(mess), null);
});
}
I also changed your async void to async Task (again, better reusability and testability), and made a new ChatMessage for each chat message, which I believe is the intended behavior.
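As a usage note, the caller now has to await the returned task; a hypothetical sketch from a page's Loaded handler (viewModel is an assumed reference to the object holding InitSignalRAsync):
private async void Page_Loaded(object sender, RoutedEventArgs e)
{
    // async void is acceptable here only because it is a top-level UI event handler.
    await viewModel.InitSignalRAsync();
}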
This is happening because Add() must run on the UI thread when the ObservableCollection is bound to the UI. To make it work, execute the call via Dispatcher.RunAsync():
Windows.ApplicationModel.Core.CoreApplication.MainView.CoreWindow.Dispatcher.RunAsync(CoreDispatcherPriority.Normal, () => {
Messages.Add(mess);
});

Create delay between two message reads of a Queue?

I am using Azure Queues to perform a bulk import.
I am using WebJobs to perform the process in the background.
The queue dequeues very frequently. How do I create a delay between two message reads?
This is how I am adding a message to the queue:
public async Task<bool> Handle(CreateFileUploadCommand message)
{
var queueClient = _queueService.GetQueueClient(Constants.Queues.ImportQueue);
var brokeredMessage = new BrokeredMessage(JsonConvert.SerializeObject(new ProcessFileUploadMessage
{
TenantId = message.TenantId,
FileExtension = message.FileExtension,
FileName = message.Name,
DeviceId = message.DeviceId,
SessionId = message.SessionId,
UserId = message.UserId,
OutletId = message.OutletId,
CorrelationId = message.CorrelationId,
}))
{
ContentType = "application/json",
};
await queueClient.SendAsync(brokeredMessage);
return true;
}
And below is the WebJobs function:
public class Functions
{
private readonly IValueProvider _valueProvider;
public Functions(IValueProvider valueProvider)
{
_valueProvider = valueProvider;
}
public async Task ProcessQueueMessage([ServiceBusTrigger(Constants.Queues.ImportQueue)] BrokeredMessage message,
TextWriter logger)
{
var queueMessage = message.GetBody<string>();
using (var client = new HttpClient())
{
client.BaseAddress = new Uri(_valueProvider.Get("ServiceBaseUri"));
var stringContent = new StringContent(queueMessage, Encoding.UTF8, "application/json");
var result = await client.PostAsync(RestfulUrls.ImportMenu.ProcessUrl, stringContent);
if (result.IsSuccessStatusCode)
{
await message.CompleteAsync();
}
else
{
await message.AbandonAsync();
}
}
}
}
As far as I know, the Azure WebJobs SDK enables concurrent processing on a single instance (the default is 16 concurrent calls).
If you run your WebJob, it will read up to 16 queue messages (peek-lock, then call Complete on each message if the function finishes successfully, or Abandon otherwise) and execute the trigger function for them at the same time. That is why the queue appears to dequeue very frequently.
If you want to disable concurrent processing on a single instance, I suggest you set the ServiceBusConfiguration's MessageOptions.MaxConcurrentCalls to 1.
For more details, refer to the code below.
In Program.cs:
JobHostConfiguration config = new JobHostConfiguration();
ServiceBusConfiguration serviceBusConfig = new ServiceBusConfiguration();
serviceBusConfig.MessageOptions.MaxConcurrentCalls = 1;
config.UseServiceBus(serviceBusConfig);
JobHost host = new JobHost(config);
host.RunAndBlock();
If you want to create a delay between two message reads, I suggest you create a custom ServiceBusConfiguration.MessagingProvider.
It contains a CompleteProcessingMessageAsync method, which completes processing of the specified message after the job function has been invoked.
You can add a delay in CompleteProcessingMessageAsync to space out the reads.
For more detail, refer to the code sample below.
CustomMessagingProvider.cs (note: I override the CompleteProcessingMessageAsync method):
public class CustomMessagingProvider : MessagingProvider
{
private readonly ServiceBusConfiguration _config;
public CustomMessagingProvider(ServiceBusConfiguration config)
: base(config)
{
_config = config;
}
public override NamespaceManager CreateNamespaceManager(string connectionStringName = null)
{
// you could return your own NamespaceManager here, which would be used
// globally
return base.CreateNamespaceManager(connectionStringName);
}
public override MessagingFactory CreateMessagingFactory(string entityPath, string connectionStringName = null)
{
// you could return a customized (or new) MessagingFactory here per entity
return base.CreateMessagingFactory(entityPath, connectionStringName);
}
public override MessageProcessor CreateMessageProcessor(string entityPath)
{
// demonstrates how to plug in a custom MessageProcessor
// you could use the global MessageOptions, or use different
// options per entity
return new CustomMessageProcessor(_config.MessageOptions);
}
private class CustomMessageProcessor : MessageProcessor
{
public CustomMessageProcessor(OnMessageOptions messageOptions)
: base(messageOptions)
{
}
public override Task<bool> BeginProcessingMessageAsync(BrokeredMessage message, CancellationToken cancellationToken)
{
// intercept messages before the job function is invoked
return base.BeginProcessingMessageAsync(message, cancellationToken);
}
public override async Task CompleteProcessingMessageAsync(BrokeredMessage message, FunctionResult result, CancellationToken cancellationToken)
{
if (result.Succeeded)
{
if (!MessageOptions.AutoComplete)
{
// AutoComplete is true by default, but if set to false
// we need to complete the message
cancellationToken.ThrowIfCancellationRequested();
await message.CompleteAsync();
Console.WriteLine("Begin sleep");
//Sleep 5 seconds
Thread.Sleep(5000);
Console.WriteLine("Sleep 5 seconds");
}
}
else
{
cancellationToken.ThrowIfCancellationRequested();
await message.AbandonAsync();
}
}
}
}
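Since CompleteProcessingMessageAsync is already async, a non-blocking delay works as well; this is just a sketch of the same override with Thread.Sleep swapped for await Task.Delay:
public override async Task CompleteProcessingMessageAsync(BrokeredMessage message, FunctionResult result, CancellationToken cancellationToken)
{
    if (result.Succeeded)
    {
        if (!MessageOptions.AutoComplete)
        {
            cancellationToken.ThrowIfCancellationRequested();
            await message.CompleteAsync();
            // Delay the next read without blocking the message pump's thread.
            await Task.Delay(TimeSpan.FromSeconds(5), cancellationToken);
        }
    }
    else
    {
        cancellationToken.ThrowIfCancellationRequested();
        await message.AbandonAsync();
    }
}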
Program.cs main method:
static void Main()
{
var config = new JobHostConfiguration();
if (config.IsDevelopment)
{
config.UseDevelopmentSettings();
}
var sbConfig = new ServiceBusConfiguration
{
MessageOptions = new OnMessageOptions
{
AutoComplete = false,
MaxConcurrentCalls = 1
}
};
sbConfig.MessagingProvider = new CustomMessagingProvider(sbConfig);
config.UseServiceBus(sbConfig);
var host = new JobHost(config);
// The following code ensures that the WebJob will be running continuously
host.RunAndBlock();
}
Result: (console output screenshot omitted)

Unit testing MassTransit consumers that make utilize asynchronous calls

We are using MassTransit asynchronous messaging (on top of RabbitMQ) for our microservice architecture.
We ran into issues testing consumers that in turn make asynchronous calls.
The example below shows a simple MassTransit consumer that uses RestSharp to make an outbound call via the asynchronous ExecuteAsync method.
public class VerifyPhoneNumberConsumer : Consumes<VerifyPhoneNumber>.Context
{
IRestClient _restClient;
RestRequest _request;
PhoneNumber _phoneNumber;
PhoneNumberVerificationResponse _responseData;
public VerifyPhoneNumberConsumer(IRestClient client)
{
_restClient = client;
}
public void Consume(IConsumeContext<VerifyPhoneNumber> context)
{
try
{
//we can do some standard message verification/validation here
_restClient.ExecuteAsync<PhoneNumberVerificationResponse>(_request, (response) =>
{
//here we might do some standard response verification
_responseData = response.Data;
_phoneNumber = new PhoneNumber()
{
Number = _responseData.PhoneNumber
};
context.Respond(new VerifyPhoneNumberSucceeded(context.Message)
{
PhoneNumber = _phoneNumber
});
});
}
catch (Exception exception)
{
context.Respond(new VerifyPhoneNumberFailed(context.Message)
{
PhoneNumber = context.Message.PhoneNumber,
Message = exception.Message
});
}
}
}
A sample unit test for this might look like the following:
[TestFixture]
public class VerifyPhoneNumberConsumerTests
{
private VerifyPhoneNumberConsumer _consumer;
private Mock<IRestClient> _mockRestClient;
private PhoneNumber _phoneNumber;
private RestResponse _response;
private VerifyPhoneNumber _command;
private AutoResetEvent _continuationEvent;
private const int CONTINUE_WAIT_TIME = 1000;
[SetUp]
public void Initialize()
{
_continuationEvent = new AutoResetEvent(false);
_mockRestClient = new Mock<IRestClient>();
_consumer = new VerifyPhoneNumberConsumer(_mockRestClient.Object);
_response = new RestResponse();
_response.Content = "Response Test Content";
_phoneNumber = new PhoneNumber()
{
Number = "123456789"
};
_command = new VerifyPhoneNumber(_phoneNumber);
}
[Test]
public void VerifyPhoneNumber_Succeeded()
{
var test = TestFactory.ForConsumer<VerifyPhoneNumberConsumer>().New(x =>
{
x.ConstructUsing(() => _consumer);
x.Send(_command, (scenario, context) => context.SendResponseTo(scenario.Bus));
});
_mockRestClient.Setup(
c =>
c.ExecuteAsync(Moq.It.IsAny<IRestRequest>(),
Moq.It
.IsAny<Action<IRestResponse<PhoneNumberVerificationResponse>, RestRequestAsyncHandle>>()))
.Callback<IRestRequest, Action<IRestResponse<PhoneNumberVerificationResponse>, RestRequestAsyncHandle>>((
request, callback) =>
{
var responseMock = new Mock<IRestResponse<PhoneNumberVerificationResponse>>();
responseMock.Setup(r => r.Data).Returns(GetSuccessfulVerificationResponse());
callback(responseMock.Object, null);
_continuationEvent.Set();
});
test.Execute();
_continuationEvent.WaitOne(CONTINUE_WAIT_TIME);
Assert.IsTrue(test.Sent.Any<VerifyPhoneNumberSucceeded>());
}
private PhoneNumberVerificationResponse GetSuccessfulVerificationResponse()
{
return new PhoneNumberVerificationResponse
{
PhoneNumber = _phoneNumber
};
}
}
Because of the invocation of the ExecuteAsync method in the consumer, this test method would fall through if we did not block until it was signaled (or timed out). In the sample above, we are using an AutoResetEvent to signal from the callback that the test can continue and run its assertions.
THIS IS A TERRIBLE METHOD, and we are exhausting every resource trying to find alternatives. If it's not obvious, this can potentially cause false failures and race conditions during testing, not to mention potentially crippling automated testing times.
What alternatives do we have that are BETTER than what we currently have?
EDIT Here is a source that I originally used for how to mock RestSharp asynchronous calls.
How to test/mock RestSharp ExecuteAsync(...)
Honestly, the complexity of doing asynchronous methods is one of the key drivers of MassTransit 3. While it isn't ready yet, it makes asynchronous method invocation from consumers so much better.
Regarding what you're testing above: because you are calling ExecuteAsync() on your REST client and not waiting for the response (using .Result or .Wait()) in the consumer, the HTTP call continues after the message consumer has returned. That might be part of your problem.
In MT3, this consumer would be written as:
public async Task Consume(ConsumeContext<VerifyPhoneNumber> context)
{
try
{
var response = await _restClient
.ExecuteAsync<PhoneNumberVerificationResponse>(_request);
var phoneNumber = new PhoneNumber()
{
Number = response.PhoneNumber
};
await context.RespondAsync(new VerifyPhoneNumberSucceeded(context.Message)
{
PhoneNumber = phoneNumber
});
}
catch (Exception exception)
{
await context.RespondAsync(new VerifyPhoneNumberFailed(context.Message)
{
PhoneNumber = context.Message.PhoneNumber,
Message = exception.Message
});
}
}
I was able to come up with the following solution, which seems far more elegant and proper. Feel free to correct me if I am wrong in assuming this.
I modified the RestSharp call in my consumer, so the consumer now looks like the following:
public class VerifyPhoneNumberConsumer : Consumes<VerifyPhoneNumber>.Context
{
IRestClient _restClient;
RestRequest _request;
PhoneNumber _phoneNumber;
PhoneNumberVerificationResponse _responseData;
public VerifyPhoneNumberConsumer(IRestClient client)
{
_restClient = client;
}
public async void Consume(IConsumeContext<VerifyPhoneNumber> context)
{
try
{
//we can do some standard message verification/validation here
var response = await _restClient.ExecuteGetTaskAsync<PhoneNumberVerificationResponse>(_request);
_responseData = response.Data;
_phoneNumber = new PhoneNumber()
{
Number = _responseData.PhoneNumber
};
context.Respond(new VerifyPhoneNumberSucceeded(context.Message)
{
PhoneNumber = _phoneNumber
});
}
catch (Exception exception)
{
context.Respond(new VerifyPhoneNumberFailed(context.Message)
{
PhoneNumber = context.Message.PhoneNumber,
Message = exception.Message
});
}
}
}
This utilizes the TPL async capabilities of RestSharp so that I don't have to do it myself.
Because of this, I am able to change my test code to the following:
[Test]
public void VerifyPhoneNumber_Succeeded()
{
var test = TestFactory.ForConsumer<VerifyPhoneNumberConsumer>().New(x =>
{
x.ConstructUsing(() => _consumer);
x.Send(_command, (scenario, context) => context.SendResponseTo(scenario.Bus));
});
var response = (IRestResponse<PhoneNumberVerificationResponse>)new RestResponse<PhoneNumberVerificationResponse>();
response.Data = GetSuccessfulVerificationResponse();
var taskResponse = Task.FromResult(response);
Expect.MethodCall(
() => _client.ExecuteGetTaskAsync<PhoneNumberVerificationResponse>(Any<IRestRequest>.Value.AsInterface))
.Returns(taskResponse);
test.Execute();
Assert.IsTrue(test.Sent.Any<VerifyPhoneNumberSucceeded>());
}
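If you are using Moq rather than the arrange/expect syntax shown above, the equivalent setup is a sketch like this (assuming the same _mockRestClient from the fixture):
_mockRestClient
    .Setup(c => c.ExecuteGetTaskAsync<PhoneNumberVerificationResponse>(It.IsAny<IRestRequest>()))
    .ReturnsAsync(response);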

TPL dataflow blocks inside a WCF duplex

I am a new writer on SO, please bear with me.
I have a WCF service with a duplex service contract. The service contract has an operation contract that is supposed to do long-running data processing. I am constrained to limit the number of concurrent data-processing operations to, let's say, a maximum of 3. My problem is that after the data processing I need to get back to the same service instance context, because I call back my initiator endpoint passing the data-processing result. I should mention that, for various reasons, I am constrained to TPL Dataflow and WCF duplex.
Here is a demo of what I wrote so far.
In a console application I simulate the WCF calls:
class Program
{
static void Main(string[] args)
{
// simulate service calls
Enumerable.Range(0, 5).ToList().ForEach(x =>
{
new System.Threading.Thread(new ThreadStart(async () =>
{
var service = new Service();
await service.Inc(x);
})).Start();
});
}
}
Here is what is supposed to be the WCF service:
// service contract
public class Service
{
static TransformBlock<Message<int>, Message<int>> transformBlock;
static Service()
{
transformBlock = new TransformBlock<Message<int>, Message<int>>(x => Inc(x), new ExecutionDataflowBlockOptions
{
MaxDegreeOfParallelism = 3
});
}
static Message<int> Inc(Message<int> input)
{
System.Threading.Thread.Sleep(100);
return new Message<int> { Token = input.Token, Data = input.Data + 1 };
}
// operation contract
public async Task Inc(int id)
{
var token = Guid.NewGuid().ToString();
transformBlock.Post(new Message<int> { Token = token, Data = id });
while (await transformBlock.OutputAvailableAsync())
{
Message<int> message;
if (transformBlock.TryReceive(m => m.Token == token, out message))
{
// do further processing using initiator service instance members
// something like Callback.IncResult(m.Data);
break;
}
}
}
}
public class Message<T>
{
public string Token { get; set; }
public T Data { get; set; }
}
The operation contract does not really need to be async, but I needed the OutputAvailableAsync notification.
Is this a good approach or is there a better solution for my scenario?
Thanks in advance.
First, I think you shouldn't use the token the way you do. Unique identifiers are useful when communicating between processes. But when you're inside a single process, just use reference equality.
To actually answer your question, I think the (kind of) busy loop is not a good idea.
A simpler solution for asynchronous throttling would be to use SemaphoreSlim. Something like:
static readonly SemaphoreSlim Semaphore = new SemaphoreSlim(3);
// operation contract
public async Task Inc(int id)
{
await Semaphore.WaitAsync();
try
{
Thread.Sleep(100);
var result = id + 1;
// do further processing using initiator service instance members
// something like Callback.IncResult(result);
}
finally
{
Semaphore.Release();
}
}
If you really want to (or have to?) use dataflow, you can use TaskCompletionSource for synchronization between the operation and the block. The operation method would wait on the Task of the TaskCompletionSource and the block would set it when it finished computation for that message:
private static readonly ActionBlock<Message<int>> Block =
new ActionBlock<Message<int>>(
x => Inc(x),
new ExecutionDataflowBlockOptions
{
MaxDegreeOfParallelism = 3
});
static void Inc(Message<int> input)
{
Thread.Sleep(100);
input.TCS.SetResult(input.Data + 1);
}
// operation contract
public async Task Inc(int id)
{
var tcs = new TaskCompletionSource<int>();
Block.Post(new Message<int> { TCS = tcs, Data = id });
int result = await tcs.Task;
// do further processing using initiator service instance members
// something like Callback.IncResult(result);
}
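For completeness, the TaskCompletionSource variant assumes the message type carries the completion source instead of a token; a minimal sketch:
public class Message<T>
{
    public TaskCompletionSource<int> TCS { get; set; }
    public T Data { get; set; }
}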
