I need to update the enqueue times of already scheduled messages in a Service Bus queue. I tried different approaches, but wasn't successful. I tried peeking messages and then receiving the ones I'm looking for, or at least completing them, but messages cannot be completed when we only peek at them. Is there any function to get a message by its sequence number, or do you have any other approach or solution that could solve this problem?
You cannot update a scheduled message but what you can do is cancel the schedule and re-schedule the same message. This is done with the help of its sequence number.
Below is an example of cancelling a scheduled message:
using System;
using Azure.Messaging.ServiceBus;

var queueName = "<queue>";
var connectionString = "<connection-string>";

await using var client = new ServiceBusClient(connectionString);
ServiceBusSender sender = client.CreateSender(queueName);

// Schedule a new message; ScheduleMessageAsync returns its sequence number.
long sequenceNumber = await sender.ScheduleMessageAsync(
    new ServiceBusMessage(), new DateTimeOffset(DateTime.UtcNow.AddMinutes(10)));

// Cancel the scheduled message above using that sequence number.
await sender.CancelScheduledMessageAsync(sequenceNumber);
Now that you have cancelled the scheduled message, you can schedule it again with the same message details. In your case, you would have to peek the message first to get the exact details you need, such as its sequence number and message body.
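For completeness, here is a rough sketch of that peek, cancel and re-schedule flow with the Azure.Messaging.ServiceBus SDK. The peek count, the new enqueue time, and matching the target message by its MessageId are assumptions for illustration, not something the SDK requires:

using System;
using System.Collections.Generic;
using Azure.Messaging.ServiceBus;

var queueName = "<queue>";
var connectionString = "<connection-string>";
var targetMessageId = "<message-id-to-move>";            // hypothetical: however you identify your message
var newEnqueueTime = DateTimeOffset.UtcNow.AddHours(1);  // hypothetical new enqueue time

await using var client = new ServiceBusClient(connectionString);
ServiceBusSender sender = client.CreateSender(queueName);
ServiceBusReceiver receiver = client.CreateReceiver(queueName);

// Peek does not lock or settle messages; it just lets us read their details,
// including the sequence numbers of scheduled messages.
IReadOnlyList<ServiceBusReceivedMessage> peeked = await receiver.PeekMessagesAsync(maxMessages: 100);

foreach (ServiceBusReceivedMessage message in peeked)
{
    if (message.MessageId != targetMessageId)
        continue;

    // Cancel the existing schedule, then re-schedule a copy of the same message.
    // This ServiceBusMessage constructor copies the body and application properties.
    await sender.CancelScheduledMessageAsync(message.SequenceNumber);
    await sender.ScheduleMessageAsync(new ServiceBusMessage(message), newEnqueueTime);
}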
Suppose we have a queue in ActiveMQ, a client that sends messages (producer) to that queue and a server that gets the messages (consumer) from the queue.
On the server side the consumer has a message listener, something like:
consumer.Listener += ConsumerOnListener;
and the implementation of ConsumerOnListener looks like the following:
private void ConsumerOnListener(IMessage message)
{
    var textMessage = message as ITextMessage;

    // validate textMessage
    // more code here... e.g. save to database, logging etc. (part-a)

    Task.Factory.StartNew(() =>
    {
        // do something else here (part-b)
    });
}
The main idea behind the above is not to wait for part-b to be executed before processing the next message. Imagine that part-b does something completely on its own, which may succeed or not (fire-and-forget).
So the question here is whether or not it is OK to use Tasks inside ConsumerOnListener. Will this somehow "block" the queue?
Assuming that the Task is asynchronous, it shouldn't block either the execution of the listener or the queue itself. Typically, concurrency in use-cases like this is increased simply by increasing the number of consumers/listeners, but it's also valid to have listeners kick off their own async threads, tasks, etc.
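As a rough illustration of the "more consumers" option, here is a minimal Apache.NMS sketch; the broker URI, the queue name and the DoPartB helper are placeholders, and each consumer gets its own session so the listeners can actually run in parallel:

using System;
using System.Threading.Tasks;
using Apache.NMS;
using Apache.NMS.ActiveMQ;

class MultiConsumerExample
{
    static void Main()
    {
        IConnectionFactory factory = new ConnectionFactory("tcp://localhost:61616");
        IConnection connection = factory.CreateConnection();
        connection.Start();

        // One session per consumer, so the broker can dispatch to them concurrently.
        for (int i = 0; i < 3; i++)
        {
            ISession session = connection.CreateSession(AcknowledgementMode.AutoAcknowledge);
            IMessageConsumer consumer = session.CreateConsumer(session.GetQueue("my.queue"));
            consumer.Listener += message =>
            {
                var textMessage = message as ITextMessage;
                // part-a: validate / persist synchronously here

                // part-b: fire-and-forget work stays off the listener thread
                Task.Run(() => DoPartB(textMessage));
            };
        }

        Console.ReadLine(); // keep the process alive while the listeners run
    }

    static void DoPartB(ITextMessage message)
    {
        // hypothetical fire-and-forget work (part-b)
    }
}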
When receiving mail through MailGun, they require a response within a limited time. I have two issues with this:
1) After receiving the message I need to process it and record it in my CRM, which takes some time. This causes MailGun to time out before I get to send a response. MailGun then resends the message again and again as it continues to time out.
2) MailGun's POST is not async, but the API calls to my CRM are async.
So I need to send MailGun a 200 response and then continue to process the message, and that processing needs to be async.
The code below shows what I want to have happen. I tried using tasks and couldn't get it working. There are times when many emails can come in at once (like when initializing someone's account), so if the solution requires some sort of parallel tasks or threads, it would need to handle many of them.
public class HomeController : Controller
{
    [HttpPost]
    [Route("mail1")]
    public ActionResult Mail()
    {
        var emailObj = MailGun.Receive(Request);
        return Content("ok");
        _ = await CRM.SendToEmailApp(emailObj);
    }
}
Thank you for the help!
The easiest way to do what you are describing (which is not recommended, because you may lose some results if your app crashes) is to use a fire & forget task:
var emailObj = MailGun.Receive(Request);
Task.Run(async () => await CRM.SendToEmailApp(emailObj));
return Content("ok");
But I think what you really want is some sort of message queue: you put the message in the queue (which is fast enough) and return immediately, while at the same time a processor works through the queue and saves the results to the CRM.
With a message-queueing broker in between, the controller only enqueues the message and returns 200, and a separate worker consumes the queue and calls the CRM.
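As a sketch of the same idea kept in-process (assuming ASP.NET Core with hosted services; EmailQueue, EmailProcessor and EmailObj are illustrative names, and CRM.SendToEmailApp is the method from the question, not defined here):

using System.Collections.Generic;
using System.Threading;
using System.Threading.Channels;
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;

public class EmailQueue
{
    private readonly Channel<EmailObj> _channel = Channel.CreateUnbounded<EmailObj>();

    // Called from the controller: cheap and non-blocking.
    public void Enqueue(EmailObj email) => _channel.Writer.TryWrite(email);

    public IAsyncEnumerable<EmailObj> ReadAllAsync(CancellationToken ct) =>
        _channel.Reader.ReadAllAsync(ct);
}

// Drains the queue in the background, outside the request pipeline.
public class EmailProcessor : BackgroundService
{
    private readonly EmailQueue _queue;
    public EmailProcessor(EmailQueue queue) => _queue = queue;

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        await foreach (EmailObj email in _queue.ReadAllAsync(stoppingToken))
        {
            await CRM.SendToEmailApp(email); // still async, but off the request thread
        }
    }
}

The controller then just calls _queue.Enqueue(emailObj) and returns Content("ok"), with EmailQueue registered as a singleton and EmailProcessor as a hosted service. Note that, like the fire & forget task, an in-process queue still loses pending work if the app crashes; an external broker avoids that.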
I have a long-running process which performs matches between millions of records. I invoke this code via a Service Bus message; however, when my process passes the 5-minute limit, Azure starts processing the already processed records from the start again.
How can I avoid this?
Here is my code:
private static async Task ProcessMessagesAsync(Message message, CancellationToken token)
{
    long receivedMessageTrasactionId = 0;

    try
    {
        IQueueClient queueClient = new QueueClient(serviceBusConnectionString, serviceBusQueueName, ReceiveMode.PeekLock);

        // Process the message
        receivedMessageTrasactionId = Convert.ToInt64(Encoding.UTF8.GetString(message.Body));

        // My Very Long Running Method
        await DataCleanse.PerformDataCleanse(receivedMessageTrasactionId);

        // Get Transaction and Metric details
        await queueClient.CompleteAsync(message.SystemProperties.LockToken);
    }
    catch (Exception ex)
    {
        Log4NetErrorLogger(ex);
        throw;
    }
}
Messages are intended for notifications, not long-running processing.
You've got a few options:
1. Receive the message and rely on the receiver's RenewLock() operation to extend the lock.
2. Use the user-callback API and specify the maximum processing time, if known, via the MessageHandlerOptions.MaxAutoRenewDuration setting to auto-renew the message's lock (see the sketch after this list).
3. Record that processing has started but do not complete the incoming message. Instead, leverage the message deferral feature: send yourself a new delayed message with a reference to the deferred message's SequenceNumber. This lets you periodically receive a "reminder" message and check whether the work is finished. If it is, complete the deferred message by its SequenceNumber; otherwise, complete the "reminder" message and send a new one. This approach would require some level of redesign of your architecture.
4. Similar to option 3, but offload the processing to an external process that reports the status later. There are frameworks that can help you with that, such as MassTransit or NServiceBus; the latter has a sample you can download and play with.
Note that options 1 and 2 are not guaranteed, as those are client-side initiated operations.
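A minimal sketch of option 2, assuming the same Microsoft.Azure.ServiceBus package the question's code already uses; it would sit alongside the ProcessMessagesAsync handler from the question, and the connection string, queue name and the 30-minute cap are placeholders:

using System;
using System.Threading.Tasks;
using Microsoft.Azure.ServiceBus;

// Registration code, e.g. in Main(), in the same class as ProcessMessagesAsync above.
var queueClient = new QueueClient("<connection-string>", "<queue>", ReceiveMode.PeekLock);

var handlerOptions = new MessageHandlerOptions(ExceptionReceivedHandlerAsync)
{
    MaxConcurrentCalls = 1,
    AutoComplete = false,
    // Keep renewing the message lock for up to 30 minutes while the handler runs.
    // Renewal is initiated client-side, so it is best-effort rather than guaranteed.
    MaxAutoRenewDuration = TimeSpan.FromMinutes(30)
};

// The pump calls ProcessMessagesAsync for each message and auto-renews its lock.
queueClient.RegisterMessageHandler(ProcessMessagesAsync, handlerOptions);

static Task ExceptionReceivedHandlerAsync(ExceptionReceivedEventArgs args)
{
    // Log and continue; e.g. the Log4NetErrorLogger from the question could be used here.
    Console.WriteLine(args.Exception);
    return Task.CompletedTask;
}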
I have been using RabbitMQ successfully for the last month. Messages are read from the queue using RabbitMQ's BasicConsume feature, and messages published to a queue are immediately consumed by the corresponding consumer.
Now I have created a new queue, DelayedMsg. Messages published to this queue should only be read after a 5 minute delay. What should I do?
Add a current timestamp value to the message while publishing it to the main queue from the publisher/sender, for example 'published_on' => 1476424186.
At the consumer side, first check the difference between the current timestamp and published_on.
If the difference is less than 5 minutes, send the message to another queue (a DLX-backed holding queue) with an expiration time set (use the 'expiration' property of the AMQP message), as in the sketch after these steps.
This expiration value should be the remaining delay, i.e. 5 minutes minus (current timestamp - published_on), in milliseconds.
The message will then expire in the holding queue once the full 5 minutes have elapsed.
Make sure 'x-dead-letter-exchange' points back to your main queue's exchange and the holding queue is bound to it, so that when the message expires it is automatically queued back onto the main queue. See Dead Letter Exchange for more details.
The consumer then gets the message back after 5 minutes and processes it normally, since the difference between the current timestamp and published_on is now greater than 5 minutes.
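Here is a rough consumer-side sketch of those steps with the RabbitMQ.Client library; the queue names ('main_queue', 'delay_queue') and the 'published_on' header being unix seconds are assumptions for illustration:

using System;
using System.Collections.Generic;
using RabbitMQ.Client;
using RabbitMQ.Client.Events;

var factory = new ConnectionFactory { HostName = "localhost" };
using var connection = factory.CreateConnection();
using var channel = connection.CreateModel();

const long delayMs = 5 * 60 * 1000;
var consumer = new EventingBasicConsumer(channel);
consumer.Received += (_, ea) =>
{
    // 'published_on' was stamped by the publisher as unix seconds.
    long publishedOn = Convert.ToInt64(ea.BasicProperties.Headers["published_on"]);
    long elapsedMs = (DateTimeOffset.UtcNow.ToUnixTimeSeconds() - publishedOn) * 1000;

    if (elapsedMs < delayMs)
    {
        // Too early: park the message in the DLX-backed holding queue ('delay_queue',
        // assumed to dead-letter back to the main queue) with the remaining time as
        // its per-message expiration, in milliseconds.
        var props = channel.CreateBasicProperties();
        props.Headers = new Dictionary<string, object> { ["published_on"] = publishedOn };
        props.Expiration = (delayMs - elapsedMs).ToString();
        channel.BasicPublish("", "delay_queue", props, ea.Body);
    }
    else
    {
        // 5 minutes have passed: process the message normally here.
    }

    channel.BasicAck(ea.DeliveryTag, multiple: false);
};

channel.BasicConsume("main_queue", autoAck: false, consumer);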
Try to avoid using DLX to implement delayed messages; it is more of a workaround from before the "RabbitMQ Delayed Message Plugin" existed.
Since we now have this plugin, we should try to use it instead:
https://www.rabbitmq.com/blog/2015/04/16/scheduling-messages-with-rabbitmq/
Create 2 queues:
1 is the work queue
2 is the delay queue
Set the delay queue's x-dead-letter properties to point at the work queue, and set its TTL to 5 minutes.
Send messages to the delay queue and do not consume from it; after 5 minutes the broker dead-letters each message to the work queue, so you just consume the work queue and process the messages there, as sketched below.
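A minimal RabbitMQ.Client sketch of that two-queue layout; the queue names and the use of the default exchange for dead-lettering are assumptions for illustration:

using System.Collections.Generic;
using System.Text;
using RabbitMQ.Client;

var factory = new ConnectionFactory { HostName = "localhost" };
using var connection = factory.CreateConnection();
using var channel = connection.CreateModel();

// Work queue: the only queue a consumer reads from.
channel.QueueDeclare("work_queue", durable: true, exclusive: false, autoDelete: false, arguments: null);

// Delay queue: no consumers. Messages sit here until the per-queue TTL expires,
// then the broker dead-letters them to the work queue via the default exchange.
channel.QueueDeclare("delay_queue", durable: true, exclusive: false, autoDelete: false,
    arguments: new Dictionary<string, object>
    {
        ["x-message-ttl"] = 5 * 60 * 1000,
        ["x-dead-letter-exchange"] = "",
        ["x-dead-letter-routing-key"] = "work_queue"
    });

// Publish to the delay queue; the message shows up on work_queue 5 minutes later.
channel.BasicPublish("", "delay_queue", null, Encoding.UTF8.GetBytes("hello after 5 minutes"));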
How do you stop receiving messages from a subscription client set up as an event-driven message pump? I currently have some code that works; however, when I run two tests consecutively, the second breaks. I'm fairly sure messages are still being pulled off the subscription by the first instance I created.
http://msdn.microsoft.com/en-us/library/dn130336.aspx
OnMessageOptions options = new OnMessageOptions();
options.AutoComplete = true; // Indicates if the message-pump should call complete on messages after the callback has completed processing.
options.MaxConcurrentCalls = 1; // Indicates the maximum number of concurrent calls to the callback the pump should initiate.
options.ExceptionReceived += LogErrors; // Enables you to be notified of any errors encountered by the message pump.

// Start receiving messages. This initiates the message pump and the callback is invoked
// for each message that is received. Calling Close() on the client will stop the pump.
Client.OnMessage((receivedMessage) =>
{
    // Process the message
    Trace.WriteLine("Processing", receivedMessage.SequenceNumber.ToString());
}, options);
You need to do two things.
First, call subscriptionClient.Close(), which will eventually (but not immediately) stop the message pump.
Second, in your message-received callback, check whether the client is closed, like so:
if (subscriptionClient.IsClosed)
return;
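Putting the two steps together, a small sketch using the Client and options from the question:

Client.OnMessage((receivedMessage) =>
{
    // The pump may still dispatch a few messages after Close() has been called.
    if (Client.IsClosed)
        return;

    Trace.WriteLine("Processing", receivedMessage.SequenceNumber.ToString());
}, options);

// Later, e.g. at the end of the first test:
Client.Close(); // eventually stops the pump; Complete/Abandon calls will throw after this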
You can call SubscriptionClient.Close() to stop further messages from being processed.
Also indicated in the comment in the code:
Calling Close() on the client will stop the pump.
I looked and saw the exact same behavior. The message pump does NOT stop trying to process messages even when you close the subscription client. If your message processing handler attempts to call .Complete or .Abandon on the message, it will throw an exception, though, because the client is closed. We need a way to stop the pump, guys.